METHOD, SYSTEM, AND NON-TRANSITORY TANGIBLE COMPUTER-READABLE MEDIA
Patent Abstract:
Method, system, and non-transitory, tangible computer-readable media. The present description relates to methods and systems for obtaining image information from an organism that includes an optical data set; calculating a growth index based on the optical data set; and calculating a predicted harvest time based on the growth index, wherein the image information includes at least one of: (a) visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, and (b) a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the image data set from at least two positions.
Publication number: BR112015024105B1
Application number: R112015024105-0
Filing date: 2014-03-17
Publication date: 2022-01-04
Inventors: Morio Ogura; Hirofumi Sumi
Applicant: Sony Corporation
IPC main class:
Patent Description:
<CROSS-REFERENCE TO RELATED APPLICATIONS> [001] This application claims the benefit of Japanese Priority Patent Application JP 2013-062017 filed March 25, 2013, the entire contents of which are incorporated herein by reference. Field of the Invention [002] The presently described technology relates to an information processing system, an information processing method of the information processing system, an image processing device and an image processing method, and a program and, in particular, to an information processing system, an information processing method of the information processing system, an image processing device and an image processing method, and a program enabling the calculation of an appropriate growth index of agricultural production and an appropriate predicted harvest time. Background of the Invention [003] In satellite remote sensing, in which the growing state and the harvest season of agricultural production are estimated by sensing reflected light (near-infrared light) from plants using a sensor mounted on a space satellite, it is difficult to acquire data under a night sky or clouds, and it takes several days for satellite data to be delivered, so real-time information is difficult to obtain. Furthermore, since a satellite travels in an orbit and the reception of information from the same point therefore depends on the satellite cycle, coarse information over a wide range is obtained, while detailed information on a narrow region is difficult to obtain. [004] Furthermore, in close-range remote sensing that uses a ground-mounted sensor, the distance from a target to the sensor is short, and therefore there are advantages in that the sensing is less affected by the atmosphere than in satellite remote sensing, target data can be acquired by the sensor without interference between the sensor and the target, and data can be acquired at preferred times and the like. A remote sensing technology such as this, in which image information is acquired in close proximity to a plant, the image information is transmitted to a computer, a vegetation index is computed by the computer, and an appropriate harvest time is evaluated or predicted based on the correlation between the index and assessment items, such as an amount of fiber, has been described (see PTL 1). Citation List Patent Literature [005] PTL 1: International Publication WO2009/116613 Summary of the Invention Technical Problem [006] However, in the technology described in PTL 1 above, since a single camera that photographs the agricultural production is provided, when production growth situations vary within a farm region, the growth situations of the entire farm are inferred from occasionally photographed growth situations of the production, and thus there are cases where the accuracy of assessing or predicting an appropriate harvest time is diminished. Also, the prior art is limited in that it fails to address the growth of organisms. Furthermore, it is not possible to understand the growth situations of many farms located in different regions. [007] Furthermore, in the technology described in PTL 1, the accuracy in evaluating a situation of agricultural production growth by computing the vegetation index through an arithmetic operation based on near-infrared light and red light data from agricultural production image data, using a near-infrared light sensor and a red light sensor, is not sufficiently reliable.
In other words, with that technology it is difficult to increase the accuracy of the assessment by evaluating the growth situation both from the colors of the produce and from the vegetation index in combination. [008] Furthermore, the technology described in PTL 1 discloses that a dedicated remote sensing device can be used as a camera. For such a dedicated remote sensing device, a multispectral camera (multiband camera) or a hyperspectral camera is used. The former requires mechanical switching of a bandpass filter, and the synchronization of the image regions is insufficient. The latter requires scanning, so the synchronization of the image regions is likewise insufficient; additionally, since such an optical system is complicated, it is difficult to miniaturize the camera, which is expensive; and, additionally, since the data occupy a large capacity, the communication load increases and the camera is thus not suitable for wireless communication. [009] Furthermore, the technology described in PTL 1 is based on the premise that the assessment result or the appropriate predicted harvest time is to be provided to a producer or a manager. In this case, the producer can predict and understand a harvest time, but it is difficult to satisfy demands from retailers, general consumers, consumers such as restaurants, distributors, or other external parties who wish to purchase agricultural produce without going through a retailer and/or who want to know the harvest time of the production. [0010] It is desirable to be able to properly compute a growth index and an appropriate predicted harvest time of agricultural production based on an RGB image and an NIR image, and to be able to distribute information about the growth index and the appropriate predicted harvest time not only to a producer and a manager, but also to retailers, general consumers and distributors, among others. Solution to the Problem [0011] Various embodiments of the present description pertain to methods that include: obtaining image information from an organism that includes an optical data set; calculating a growth index based on the optical data set; and calculating a predicted harvest time based on the growth index, wherein the image information includes at least one of: (a) visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, and (b) a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the image data set from at least two positions. [0012] Additional embodiments concern systems that include: a server and an image capture device, in which at least one of the server and the image capture device is configured to: obtain image information from an organism that includes an optical data set; calculate a growth index based on the optical data set; and calculate a predicted harvest time based on the growth index, wherein the image information includes at least one of: (a) visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, and (b) a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the image data set from at least two positions.
[0013] Still further embodiments concern non-transitory, tangible computer-readable media having stored thereon instructions that cause a processor to execute a method, the method including: obtaining image information from an organism that includes an optical data set; calculating a growth index based on the optical data set; and calculating a predicted harvest time based on the growth index, wherein the image information includes at least one of: (a) visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, and (b) a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the image data set from at least two positions. [0014] As used herein in various illustrative embodiments, the terms "production" and "agricultural production" include organisms. An organism is any living system. The living system can be biologically contiguous. [0015] A further definition of organism, as used herein, is a structure of molecules functioning as a more or less stable whole that exhibits the properties of life, which includes any living structure capable of growth. Thus, for example, an organism includes, but is not limited to, an animal, fungus, microorganism, and plant. [0016] Therefore, the term "production" and variations thereof, including, but not limited to, "agricultural production", as used herein, include, but are not limited to, animals such as cows, goats, sheep, pigs, fish and birds. [0017] Thus, for example, the term "growth index" and variations thereof, including, but not limited to, "growth status information" and "growth situation information", includes, but is not limited to, growth of organisms, including produce and animals. [0018] Furthermore, for example, the term "harvest" and variations thereof, including, but not limited to, "harvesting", "harvest time information", "appropriate predicted harvest time", "harvest plan", "harvest plan information", "harvest start time", "harvest time" and "harvest time limit", refer to the harvest of organisms. In various illustrative embodiments, harvesting includes any gathering of mature organisms, including crops and/or animals. [0019] Thus, the term "assessment of a growth situation" and variations thereof, as used herein, includes assessment of a growth situation of organisms such as animals and crops. Such an assessment may use various properties of the animals and crops, including a growth index and other properties not explicitly listed here. [0020] Methods and systems described herein may use optical data. For example, an optical data set can be used to obtain growth information or a growth index. Optical data can include captured image data, including visible and non-visible image data. [0021] As used herein, the term "visible image data" may include image data that uses a red-green-blue color model (also referred to as RGB). For example, digital cameras and video cameras often use a particular RGB color space. [0022] As used herein, the term "non-visible image data" may include near-infrared rays (hereinafter also referred to as NIR). [0023] As used herein, the term "external parties" and variations thereof include general consumers, retailers, restaurants and food producers. For example, external parties can include any person or company related to the supply chain system.
[0024] An image capture device, as used in the various illustrative embodiments described herein, is a device that captures image data or image information. For example, an image capture device may include, but is not limited to, optical devices that store and/or transmit still or moving image data, such as a camera or video camera. [0025] The term "sensor camera" and variations thereof, as used herein, refer to a device that captures images. Sensor cameras can have various functionality, such as the ability to collect, send and/or store various properties. Such properties may include, but are not limited to, information related to growth, temperature, humidity, and atmospheric pressure. [0026] Furthermore, sensor cameras may have the functionality to transfer information or data over a network or to an external device. For example, sensor cameras can feed information, including captured image data, to a server. [0027] In the description given here, for purposes of illustration, methods may be described in a particular order. It should be noted that, in alternative embodiments, the methods may be performed in a different order than described. It should also be noted that the methods described here can be performed by hardware components or can be embodied in sequences of machine-executable instructions that can be used to make a machine, such as a general-purpose or special-purpose processor (CPU or GPU) or logic circuits (e.g., an FPGA) programmed with the instructions, perform the methods. These machine-executable instructions may be stored on one or more machine-readable media, such as CD-ROMs or other optical disks, floppy disks, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, flash memory, or other types of machine-readable media suitable for storing electronic instructions. Alternatively, the methods can be performed by a combination of hardware and software. [0028] Specific details are given in the description to provide a thorough understanding of the embodiments. However, it will be understood by those skilled in the art that the illustrative embodiments can be practiced without these specific details. [0029] For example, in some cases, well-known circuits, processes, algorithms, structures, and techniques may be shown or discussed without unnecessary detail in order to avoid obscuring the illustrative embodiments. [0030] Also, it is noted that the embodiments may be described as various processes that can be represented as a flowchart, a flow diagram, a data flow diagram, a structure diagram or a block diagram, among others. While any of these representations can describe various parts of the operations as a sequential process or sequential processes, many of the operations or parts of the operations can be performed in parallel, concurrently, and/or redundantly. [0031] Furthermore, the order of operations can be rearranged. A process is terminated when its operations are complete, but it may have additional steps or repetitive steps not included in the figure. A process can correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return from the function to the calling function or the main function. [0032] Furthermore, embodiments can be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, among others, or any combination thereof.
When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored on machine-readable media, such as storage media. [0033] One or more processors can perform the required tasks. A code segment can represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters or memory contents. Information, arguments, parameters, data, etc. can be passed, forwarded or transmitted through any suitable means, including memory sharing, message passing, token passing, network transmission, etc. [0034] While illustrative embodiments of the description have been described in detail herein, it is to be understood that the inventive concepts may otherwise be variously embodied and employed, and that the appended claims are intended to be interpreted to include such variations, except as limited by the prior art. Advantageous Effects of the Invention [0035] According to the embodiments of the presently described technology, a growth index and an appropriate predicted harvest time of agricultural production can be computed. In various embodiments, the computation of the growth index and the appropriate predicted harvest time can be improved over prior art computations. Brief Description of the Drawings [0036] Figure 1 is an illustrative diagram showing an example configuration of an information processing system according to various embodiments of the presently described technology. [0037] Figure 2 is an illustrative diagram showing an example configuration of the sensor camera of figure 1 according to various embodiments of the presently described technology. [0038] Figure 3 is an illustrative diagram showing an example configuration of a sensor in the sensor camera of figure 2 according to various embodiments of the presently described technology. [0039] Figure 4 is an illustrative diagram showing an example configuration of a terminal device of figure 1 according to various embodiments of the presently described technology. [0040] Figure 5 is an illustrative diagram showing an example configuration of the server of figure 1 according to various embodiments of the presently described technology. [0041] Figure 6 is an illustrative diagram showing an example configuration of management information according to various embodiments of the presently described technology. [0042] Figure 7 is an illustrative flowchart describing a process of accumulating growth situation information performed by the sensor camera according to various embodiments of the presently described technology. [0043] Figure 8 is an illustrative diagram describing a method of transferring growth situation information between sensor cameras according to various embodiments of the presently described technology. [0044] Figure 9 is an illustrative diagram describing another method of transferring growth situation information between sensor cameras according to various embodiments of the presently described technology. [0045] Figure 10 is an illustrative flowchart describing a sensing process performed by the sensor camera according to various embodiments of the presently described technology.
[0046] Figure 11 is an illustrative flowchart describing the process of accumulating growth situation information performed by a server according to various embodiments of the presently described technology. [0047] Figure 12 is an illustrative flowchart describing a process of receiving the harvest plan performed by the terminal device according to various embodiments of the presently described technology. [0048] Figure 13 is an illustrative diagram describing a principle of imaging a stereoscopic image according to various embodiments of the presently described technology. [0049] Figure 14 is an illustrative flowchart describing an inquiry response process between the terminal device and the server according to various embodiments of the presently described technology. [0050] Figure 15 is an illustrative diagram describing a first example of sensor modification according to various embodiments of the presently described technology. [0051] Figure 16 is an illustrative diagram describing a second example of sensor modification according to various embodiments of the presently described technology. [0052] Figure 17 is an illustrative diagram describing a third example of sensor modification according to various embodiments of the presently described technology. [0053] Figure 18 is an illustrative diagram describing a fourth example of sensor modification according to various embodiments of the presently described technology. [0054] Figure 19 is an illustrative diagram describing an example configuration of a general-purpose personal computer according to various embodiments of the presently described technology. Description of Illustrative Embodiments [0055] In the following, various illustrative embodiments of the present description (hereinafter referred to as embodiments) will be described. Note that the description will be provided in the following order.
1. First embodiment (Example configuration of an embodiment of an information processing system)
2. First modification example (First modification example of a sensor structure)
3. Second modification example (Second modification example of the sensor structure)
4. Third modification example (Third modification example of the sensor structure)
5. Fourth modification example (Fourth modification example of the sensor structure)
(1. First embodiment) <Example configuration of an information processing system> [0056] First, with reference to figure 1, an example configuration of an information processing system that is an illustrative example configuration of various embodiments of the presently described technology will be described. [0057] The information processing system of figure 1 is configured to include sensor cameras 11-1 through 11-N, terminal devices 12-1 through 12-4, each managed by a consumer, a retailer, a distributor or a farmer, a network 13 and a server 14. In the information processing system of figure 1, images captured by the sensor cameras 11-1 through 11-N are supplied to the server 14 through the network 13, represented by the Internet, and the server 14 thus computes an agricultural production growth index and computes an appropriate predicted harvest time based on the growth index. Furthermore, the server 14 responds to inquiries, such as about an appropriate predicted harvest time, from the terminal devices 12-1 through 12-4, each managed by the consumer, the retailer, the distributor, the farmer or other external parties.
[0058] In more detail, the sensor cameras 11-1 through 11-N are arranged at predetermined intervals so that the entire cropland on which agricultural production is managed can be imaged (or so that regions close to the entirety of the cropland can be imaged by the sensor cameras 11-1 through 11-N as a whole); images that include RGB pixels and NIR pixels are captured, and the captured image data are transmitted to the server 14 via the network 13. Furthermore, the sensor cameras 11 measure temperature, humidity and atmospheric pressure information, among others, as environment information, and supply this information, together with the captured image data, to the server 14 as growth situation information. Note that the sensor cameras 11-1 through 11-N are referred to simply as sensor cameras 11 unless otherwise specified, and the same applies to other configurations. [0059] The terminal devices 12-1 through 12-4 are information processing devices configured, for example, as personal computers, among others (also including mobile terminals, such as so-called smartphones), managed, respectively, by a consumer, a retailer, a distributor and a farmer; they make inquiries about growth index and appropriate predicted harvest time information, among others, through the network 13, and receive and display the information returned by the server 14 in response to the inquiries. [0060] The server 14 acquires and accumulates growth situation information based on the image data and other information supplied from the sensor cameras 11, and computes a growth index and an appropriate predicted harvest time based on the image data. Furthermore, the server 14 also uses past growth situation information, in addition to the image data supplied from the sensor cameras 11, to compute the appropriate predicted harvest time. Additionally, when the appropriate predicted harvest time computed based on the growth situation information arrives, the server 14 notifies the terminal devices 12-1 through 12-4 managed, respectively, by the consumer, the retailer, the distributor and the farmer, through the network 13, that the appropriate predicted harvest time has arrived. Note that the appropriate predicted harvest time may be the date predicted as an appropriate day to start harvesting, a day a predetermined number of days before that date, or a period of a predetermined number of days starting from the day before that date. <Configuration example to perform a function of sensor cameras> [0061] Referring to figure 2, an illustrative configuration example to perform a function of the sensor cameras 11 will be described. [0062] Each sensor camera 11 is provided with a sensor 31, an RGB image generation unit 32, an NDVI image generation unit 33, a control unit 34, an IP address storage unit 35, a GPS 36, an environment information measuring unit 37, an RTC 38, a growth situation information generation unit 39, a communication unit 40 and a communication path specification unit 41. The sensor 31 is configured, for example, as an image sensor, and has a pixel array as illustrated, for example, in figure 3. In other words, in the pixel array of the sensor 31, one of the green sites of a Bayer array composed of general RGB (red, green and blue) pixels, as shown in image P1, is replaced with a near-infrared ray pixel. Note that, in the following drawings, the horizontally striped pattern indicates green, the vertically striped pattern indicates blue, the upper shaded part indicates red, and the lower shaded portion indicates near-infrared rays.
[0063] The RGB image generation unit 32 generates an RGB image from the image signals captured by the sensor 31. In other words, the RGB image generation unit 32 extracts green, red and blue signals based on the image signals captured by the sensor 31 with the pixel array shown in image P1 of figure 3, as shown respectively by images P11 through P13, and thereby generates green, red and blue component signal images, as shown by images P21 through P23, by demosaicing the signals. Additionally, the RGB image generation unit 32 generates an RGB image, as shown by image P31, by combining the RGB component signal images shown by images P21 through P23. [0064] The Normalized Difference Vegetation Index (NDVI) image generation unit 33 generates NIR images from the image signals captured by the sensor 31. In other words, the NDVI image generation unit 33 extracts NIR signals, in the manner shown by image P14, based on the image signals captured by the sensor 31 with the pixel array shown in image P1 of figure 3, and generates an NIR component signal image by demosaicing, in the manner shown by image P24. Furthermore, the NDVI image generation unit 33 generates an NDVI image based on the NIR component signal image and the red component signal image generated by the above-described RGB image generation unit 32. Note that the Normalized Difference Vegetation Index (NDVI) will be described in detail below. [0065] The control unit 34 consists, for example, of a microprocessor, a memory and the like; it executes various processes by running programs stored in the memory and thereby controls the entire operation of the sensor camera 11. [0066] The Internet Protocol (IP) address storage unit 35 stores an IP address, which is information to individually identify the sensor cameras 11, and can supply the IP address information to the control unit 34. The Global Positioning System (GPS) unit 36 receives radio waves from satellites not shown in the drawings, computes positional information, such as the longitude and latitude at which the sensor camera 11 is installed, and supplies the information to the control unit 34. The environment information measuring unit 37 measures information on temperature, humidity and atmospheric pressure, among others, as information on the environment in which the sensor cameras 11 are installed, and supplies the information to the control unit 34. The real-time clock (RTC) 38 continuously generates time information and supplies it to the control unit 34. Note that the example in which IP addresses are used as information to individually identify the sensor cameras 11 is described here; however, any information that can individually identify the sensor cameras 11 can be used instead of IP addresses. [0067] When the sensor 31 captures an image, the growth situation information generation unit 39 generates growth situation information that includes the IP address, the RGB image, the NDVI image, the positional information and the environment information, together with capture timing information. Note that information other than the IP address, RGB image, NDVI image, positional information and environment information can be included in the growth situation information, as long as a growth situation can be verified with the information.
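As a minimal sketch of the channel extraction described in [0063] and [0064], the following splits a raw mosaic into quarter-resolution R, G, B and NIR planes. The exact tile layout is an assumption, since image P1 of figure 3 only states that one green site of the Bayer array is replaced with an NIR pixel; a real demosaicing process would additionally interpolate each plane back to full resolution (images P21 through P24).

```python
import numpy as np

def split_mosaic(raw: np.ndarray):
    """Split a 2x2-tiled mosaic into quarter-resolution planes.

    Assumed (hypothetical) tile layout, with NIR replacing one of the
    two green sites of the RGB Bayer array shown in image P1:
        R   G
        NIR B
    """
    r   = raw[0::2, 0::2].astype(np.float32)
    g   = raw[0::2, 1::2].astype(np.float32)
    nir = raw[1::2, 0::2].astype(np.float32)
    b   = raw[1::2, 1::2].astype(np.float32)
    return r, g, b, nir
```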
[0068] The communication unit 40 is a unit that performs wired or wireless communication via the network 13, such as the Internet, and includes, for example, an Ethernet card, among others; it is controlled by the control unit 34 to transmit the growth situation information to the server 14. The communication path specification unit 41 specifies a communication path for the transmission of the growth situation information by the communication unit 40. In other words, the communication path specification unit 41 transmits the growth situation information, which must be supplied to the server 14 by numerous sensor cameras 11, to the server 14 in the form of a sequential relay between sensor cameras 11. That is, when the growth situation information of each of the sensor cameras 11-1 through 11-3 is transmitted, the sensor camera 11-1 transmits its growth situation information to the sensor camera 11-2, and the sensor camera 11-2 supplies the growth situation information received from the sensor camera 11-1, together with its own growth situation information, to the sensor camera 11-3. Furthermore, the sensor camera 11-3 supplies the growth situation information of the sensor cameras 11-1 and 11-2 and its own growth situation information to the server 14. In order to carry out this communication, the communication path specification unit 41 specifies a communication path by deciding which sensor cameras 11 the growth situation information from a given sensor camera should pass through. As a specific illustrative example, when the communication path specification unit 41 of a sensor camera 11 communicates, via the communication unit 40, with the communication path specification unit 41 of a nearby sensor camera 11 with which it captures images as a pair in order to constitute, for example, a stereoscopic image to be described later, each sensor camera defines and specifies a path along which to transmit the growth situation information. With this process, the complexity of the communication path can be reduced, and the communication speed can be increased. This form of communication can be the same as near field communication represented, for example, by ZigBee (trademark). Note that any communication path can be used as long as the pieces of growth situation information can be sequentially transmitted to the server 14 efficiently; the relay form described above is a mere example, and the pieces of information can be transmitted in another way. <Configuration example to perform a function of terminal devices> [0069] Referring to figure 4, an illustrative configuration example to perform a function of the terminal devices 12, each managed by the consumer, the retailer, the distributor and the farmer, will be described. [0070] Each of the terminal devices 12 managed by the consumer, the retailer, the distributor and the farmer is configured to include a control unit 61, an inquiry unit 62, an operation unit 63, a communication unit 64, an IP address storage unit 65 and a display unit 66. The control unit 61 may include a microprocessor and a memory, among other components, and controls all operations of the terminal device 12 with the microprocessor executing data and programs stored in the memory.
When there is an instruction, given through an operation of the operation unit 63, which includes a keyboard and a mouse, among others, to inquire about all or some of the images captured by the sensor cameras 11, a growth index and an appropriate predicted harvest time, the inquiry unit 62 controls the communication unit 64, which includes an Ethernet card, for example, in such a way that inquiry information is generated for inquiring of the server 14 about the images captured by the sensor cameras 11, the growth index and the appropriate predicted harvest time, together with IP address information specifying the sensor cameras 11 that are stored in the IP address storage unit 65 and managed by the inquiry unit itself (or about which the inquiry unit wishes to inquire). The inquiry unit 62 transmits the generated inquiry information to the server 14 using the communication unit 64. Furthermore, the communication unit 64 receives the response information transmitted from the server 14 in response to the inquiry information and supplies the information to the control unit 61. The control unit 61 causes the display unit 66, which includes a liquid crystal display (LCD) or an organic EL (electroluminescence) display, among others, to display the response information. <Configuration example to perform a server function> [0071] Referring to figure 5, an illustrative configuration example to perform a function of the server 14 will be described. [0072] The server 14 is configured to include a control unit 81, a growth situation information accumulation unit 82, a target region specification unit 83, an RGB image growth index computing unit 84, an NDVI image growth index computing unit 85, a stereoscopic image growth index computing unit 86, a harvest time computing unit 87, a management information accumulation unit 88, a growth index computing unit 89, a mapping unit 90, a sensor camera operating situation monitoring unit 91, a communication unit 92, a harvest plan creation unit 93, a distribution plan creation unit 94, a sales plan creation unit 95, an acquisition plan creation unit 96, an inquiry receiving unit 97, and a response creation unit 98. [0073] The control unit 81 may include a microprocessor, memory and the like, and controls all operations of the server 14 by executing data and programs stored in the memory. [0074] The growth situation information accumulation unit 82 stores the growth situation information supplied from the sensor cameras 11 via the communication unit 92 in association with the IP addresses used to identify the sensor cameras 11. [0075] The target region specification unit 83 specifies a region in an image in which the agricultural production to be monitored is present, based on an RGB image included in the growth situation information. As a specific illustrative example, the target region specification unit 83 stores patterns of colors and shapes that serve as feature information for each crop, and specifies a target region by searching for a region that corresponds to the feature information in the RGB image. Note that the target region specification unit 83, described here as provided in the server 14, can instead be provided in each sensor camera 11, so that, for example, target region information is included in the growth situation information.
Furthermore, since the target region specification unit 83 only needs to be able to specify a target region, it can specify a target region using an image other than an RGB image, for example an NIR image. [0076] The RGB image growth index computing unit 84 computes a growth index based on information from an image region specified as a target region in an RGB image. For example, since the time at which the ratio of green occupying a rice husk in a given year is about 10% is defined as a harvest start time, and the time at which this ratio is about 2% is defined as a harvest time limit, the RGB image growth index computing unit 84 computes a growth index based on the green ratios of the rice husks. Since the RGB image growth index computing unit 84 computes an RGB image growth index using only image information from the region in the RGB image in which a target is present, it can compute the growth index with higher accuracy. [0077] The NDVI image growth index computing unit 85 computes a growth index based on information from an image region specified as a target region in an NDVI image. Here, NDVI denotes the normalized difference vegetation index, as expressed by the following formula (1):

NDVI = (R_NIR - R_RED) / (R_NIR + R_RED) ... (1)

[0078] In formula (1), NDVI is the normalized difference vegetation index, R_NIR is the near-infrared light reflectance, and R_RED is the red light reflectance. Thus, the NDVI image generation unit 33 of the above-described sensor camera 11 generates, as an NDVI image, an image obtained from an arithmetic operation of the above-described formula (1). An NDVI is used as a foliage growth index. Note that the reflectances of near-infrared light and red light are computed by obtaining the red light intensity and the NIR intensity in a region that is not a target region, e.g. the sky, as the incident light intensity, and by obtaining the red light intensity and the NIR intensity in a target region as the reflected light intensity, from an RGB image and an NIR image. Furthermore, the reflectances of near-infrared light and red light can also be obtained by measuring the intensity of incident light relative to a diffuser panel that has a known reflectance, by calculating a reflection coefficient from the ratio between the reflection intensity and the luminance of a target, and by converting the coefficient into a reflectance. Furthermore, the NDVI image growth index computing unit 85 computes an NDVI image growth index from the average value, variance, or higher-order variance of the NDVI of the target region only. With this operation, the NDVI image growth index is computed only from the information obtained from the pixels in the target region, and can therefore be computed with higher precision. [0079] The stereoscopic image growth index computing unit 86 generates a parallax image based on information from regions of the same target captured by a plurality of sensor cameras 11, acquires the sizes of the agricultural production target as stereoscopic information, and computes a stereoscopic image growth index based on image information that includes the stereoscopic sizes. [0080] The harvest time computing unit 87 computes an appropriate predicted harvest time based on an RGB growth index, an NDVI growth index, a stereoscopic image growth index, and past information of the aforementioned elements accumulated in the growth situation information accumulation unit 82.
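As a concrete sketch of the computation performed by the NDVI image growth index computing unit 85 using formula (1) above: given per-pixel NIR and red reflectance arrays and a boolean mask for the target region, the NDVI image and a growth index over the target region can be computed as follows. The choice of the mean as the index is one of the options named in [0078]; the function names are illustrative.

```python
import numpy as np

def ndvi_image(nir_reflectance: np.ndarray,
               red_reflectance: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI according to formula (1):
    NDVI = (R_NIR - R_RED) / (R_NIR + R_RED)."""
    num = nir_reflectance - red_reflectance
    den = nir_reflectance + red_reflectance
    out = np.zeros_like(den, dtype=np.float32)
    valid = den > 0
    out[valid] = num[valid] / den[valid]
    return out

def ndvi_growth_index(ndvi: np.ndarray, target_mask: np.ndarray) -> float:
    """Growth index from the target region only; here the mean NDVI,
    though the variance or a higher-order variance could be used."""
    return float(ndvi[target_mask].mean())
```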
[0081] The management information accumulation unit 88 stores, for each IP address identifying a sensor camera 11, information on the sensor position, the region (a country, a city, etc.), the type of agricultural production, the owner of the agricultural production (or of a farm), the farm or field group (Gp), the contract distributor, the contract retailer, the group, the growth index and the appropriate predicted harvest time, as shown in figure 6. In the sensor position field, information acquired by the GPS 36 provided in the sensor camera 11 is recorded. In the region field, a country, a city, etc. defined in association with the sensor position is recorded. In the type of agricultural production field, information indicating the type of agricultural production cultivated in the cultivation area monitored by the sensor cameras 11 is recorded. In the field of the owner of the agricultural production (or of a farm), information on the owner of the agricultural production or of the farm on which the sensor cameras 11 specified by the IP addresses are installed is recorded. In the farm or field group (Gp) field, a group, etc. managed, for example, by the same owner is recorded. In the contract distributor field, information on the distributor that will transport the agricultural production monitored by the sensor cameras 11 identified by the IP addresses is recorded. In the contract retailer field, information on the contract retailer who will sell the agricultural production monitored by the sensor cameras 11 identified by the IP addresses is recorded. In the group field, a group name assigned to regions in which harvesting is carried out at the same time is recorded. In the growth index field, the growth index of the agricultural production in the range monitored by the sensor cameras 11 identified by the IP addresses is recorded. In the appropriate predicted harvest time field, information on the appropriate predicted harvest time, predicted based on the growth index and past information thereof, is recorded. [0082] In figure 6, AAA, BBB and CCC are recorded as IP addresses. Furthermore, the sensor position with the IP address AAA is indicated by A, the region by a, the type of agricultural production by Alpha, the owner of the agricultural production by "Kou", the farm or field group (Gp) by G1, the contract distributor by (1), the contract retailer by "Ah", the group by i, the growth index by 60, and the appropriate predicted harvest time by October 15. [0083] Similarly, the sensor position with the IP address BBB is indicated by B, the region by a, the type of agricultural production by Alpha, the owner of the agricultural production by "Kou", the farm or field group (Gp) by G1, the contract distributor by (1), the contract retailer by "Ah", the group by i, the growth index by 70, and the appropriate predicted harvest time by October 16. [0084] Furthermore, the sensor position with the IP address CCC is indicated by C, the region by c, the type of agricultural production by Beta, the owner of the agricultural production by "Otsu", the farm or field group (Gp) by G2, the contract distributor by (2), the contract retailer by "Eah", the group by ii, the growth index by 65, and the appropriate predicted harvest time by October 20. [0085] The growth index computing unit 89 computes a defined growth index, for example, as a weighted average of an RGB growth index, an NDVI growth index and a stereoscopic image growth index, based on any one or all of these indices.
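A minimal sketch of the combination performed by the growth index computing unit 89 in [0085] is shown below; the weighted average ignores indices that were not computed, and the weights are hypothetical tuning parameters, not values from the description.

```python
def combined_growth_index(rgb_index=None, ndvi_index=None, stereo_index=None,
                          weights=(0.3, 0.4, 0.3)):
    """Weighted average of the available growth indices ([0085]).
    Indices that were not computed are passed as None and ignored;
    the weights shown are hypothetical."""
    pairs = [(idx, w) for idx, w in
             zip((rgb_index, ndvi_index, stereo_index), weights)
             if idx is not None]
    if not pairs:
        raise ValueError("at least one growth index is required")
    total = sum(w for _, w in pairs)
    return sum(idx * w for idx, w in pairs) / total
```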
[0086] The mapping unit 90 generates information obtained by mapping the growth indices and appropriate predicted harvest times onto maps of each region. [0087] The sensor camera operating situation monitoring unit 91 compares time-series changes of the RGB images included in the growth situation information and, when there are extremely drastic changes, monitors the operating situations by determining whether or not an abnormal operating state is occurring in the sensor cameras 11. [0088] The communication unit 92 may include an Ethernet card and the like, and is controlled by the control unit 81, thereby receiving the growth situation information and the inquiry information transmitted from the terminal devices 12 and transmitting the response information to the terminal devices 12. [0089] The harvest plan creation unit 93 generates harvest plan information on the harvest time based on the growth situation information and the appropriate predicted harvest time information, and transmits the information to the terminal device 12 managed and operated by the farmer using the communication unit 92. Note that the harvest plan information can be transmitted not only to the terminal device 12 managed and operated by the farmer, but also to the terminal devices 12 managed and operated by the distributor, the retailer and the consumer. Owing to this transmission, the distributor, retailer and consumer can also formulate their own distribution plans, sales plans and acquisition plans from the harvest plan information. [0090] The distribution plan creation unit 94 generates distribution plan information on the harvest time based on the growth situation information and the appropriate predicted harvest time information, and transmits the information to the terminal device 12 managed and operated by the distributor using the communication unit 92. [0091] The sales plan creation unit 95 generates sales plan information on the harvest time based on the growth situation information and the appropriate predicted harvest time information, and transmits the information to the terminal device 12 managed and operated by the retailer using the communication unit 92. [0092] The acquisition plan creation unit 96 generates acquisition plan information on the harvest time based on the growth situation information and the appropriate predicted harvest time information, and transmits the information to the terminal device 12 managed and operated by the consumer using the communication unit 92. [0093] The inquiry receiving unit 97 controls the communication unit 92 in such a way that inquiry information, including inquiries regarding a harvest time, among other information, transmitted from the terminal devices 12 operated by any one of the consumer, the retailer, the distributor and the farmer is received, for example, through the network 13. [0094] The response creation unit 98 generates response information that includes, for example, the growth index mapping information generated by the mapping unit 90 corresponding to the information received as inquiry information, and controls the communication unit 92 in such a way that the response information is transmitted to the terminal devices 12 that transmitted the inquiry information.
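The description does not specify how the operating situation monitoring unit 91 detects the "extremely drastic changes" mentioned in [0087]; a minimal sketch is mean absolute frame differencing with a hypothetical threshold.

```python
import numpy as np

def abnormal_operating_state(prev_rgb: np.ndarray, curr_rgb: np.ndarray,
                             threshold: float = 60.0) -> bool:
    """Flag a sensor camera 11 when consecutive RGB frames differ
    drastically (e.g. lens obstruction or the camera being knocked
    over). The threshold is a hypothetical tuning value."""
    diff = np.abs(curr_rgb.astype(np.float32) - prev_rgb.astype(np.float32))
    return float(diff.mean()) > threshold
```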
<Process of accumulating growth situation information by a sensor camera> [0095] With reference to figure 7, an illustrative process of accumulating growth situation information by a sensor camera 11 will be described. [0096] In Step S11, the control unit 34 of the sensor camera 11 determines whether or not a predetermined time has elapsed since the previous sensing process, based on the time information generated by the RTC 38 and the time information at which the previous sensing process started. When the predetermined time has not elapsed since the previous sensing process in Step S11, the process proceeds to Step S15. [0097] In Step S15, the control unit 34 determines whether or not an end of operation has been instructed through an operation of an operation unit not shown in the drawings. When an end of operation is instructed in Step S15, the process ends, and when an end of operation is not instructed, the process returns to Step S11. In other words, the processes of Steps S11 and S15 are repeated until an end of operation is instructed or the predetermined time elapses. Furthermore, when the predetermined time has elapsed in Step S11, the process proceeds to Step S12. [0098] In Step S12, the sensor 31 performs a sensing process, and acquires an RGB image and an NDVI image from the sensing process. Note that several illustrative embodiments of the sensing process will be described below in detail in relation to the flowchart of figure 10. [0099] In Step S13, the growth situation information generation unit 39 generates growth situation information based on the RGB image and the NDVI image acquired from the sensing process, the IP address stored in the IP address storage unit 35, the positional information including longitude and latitude acquired by the GPS 36, the temperature, humidity and atmospheric pressure information measured by the environment information measuring unit 37, and the time information generated by the RTC 38. Note that, since the growth situation information only needs to include information that indicates a growth situation of the agricultural production or information with which the growth situation can be recognized, the growth situation information can include other such information in addition to the RGB image, the NDVI image, the IP address, the positional information including longitude and latitude, the temperature, humidity and atmospheric pressure information, and the time information. [00100] In Step S14, the control unit 34 controls the communication unit 40 in such a way that the generated growth situation information is transmitted to the server 14, and the process returns to Step S11. At this time, in various embodiments, the control unit 34 controls the communication path specification unit 41 so as to communicate with a peripheral sensor camera 11, specify the sensor camera 11 through which the growth situation information is to be passed on its way to the server 14, and then transmit the growth situation information to the server 14 via the sensor camera 11 specified in the communication path.
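The loop of Steps S11 through S15 can be sketched as follows. Here `sensor`, `gps`, `env_meter` and `uplink` are hypothetical interfaces standing in for the sensor 31, the GPS 36, the environment information measuring unit 37 and the communication units 40 and 41, and the one-hour interval is an assumption; the description only speaks of a predetermined time.

```python
import time

SENSING_INTERVAL_S = 3600  # hypothetical: one sensing pass per hour

def camera_main_loop(sensor, gps, env_meter, uplink, ip_address):
    last = 0.0
    while not uplink.stop_requested():          # Step S15: end of operation?
        now = time.time()
        if now - last < SENSING_INTERVAL_S:     # Step S11: interval elapsed?
            time.sleep(1.0)
            continue
        last = now
        rgb, ndvi = sensor.sense()              # Step S12: sensing process
        info = {                                # Step S13: growth situation info
            "ip": ip_address,
            "rgb": rgb,
            "ndvi": ndvi,
            "position": gps.position(),
            "environment": env_meter.read(),    # temperature, humidity, pressure
            "time": now,
        }
        uplink.send(info)                       # Step S14: transmit to server 14
```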
[00101] In other words, as illustrated in figure 8, when the positions at which sensor cameras 11 are installed are indicated by nodes N1 through N10, for example, and the information is to be transmitted to the Internet via a base station K and a gateway GW, the growth situation information of the sensor camera 11 corresponding to node N5 is transferred to the base station K via the sensor camera 11 indicated by node N4 and the sensor camera 11 indicated by node N3, which are located nearby. In this illustrative case, the sensor camera 11 indicated by node N4 transfers the growth situation information of node N5 together with its own growth situation information to node N3. Furthermore, nodes N1 and N2 transfer their own growth situation information to node N3. Furthermore, node N3 rearranges the growth situation information of nodes N1 through N5 and transfers it to the base station K. Furthermore, the sensor camera 11 indicated by node N7 transfers its growth situation information to the sensor camera 11 indicated by node N6, and the sensor camera 11 indicated by node N6 rearranges the growth situation information of nodes N6 and N7 and transmits it to the Internet via the base station K and the gateway GW. Furthermore, the sensor cameras 11 indicated by nodes N9 and N10 respectively transfer their growth situation information to the sensor camera 11 indicated by node N8, and the sensor camera 11 indicated by node N8 rearranges the growth situation information of nodes N8 through N10 and transmits it to the Internet via the base station K and the gateway GW. [00102] Owing to the above process, the congestion in the communication between the base station K and the gateway GW can be alleviated, and the growth situation information can be transferred at a higher speed than when the growth situation information of all the sensor cameras 11 is transmitted at once. Note that, since the growth situation information of all the sensor cameras 11 only needs to be transmitted to the server 14 efficiently, it can be transferred using methods other than relaying between the sensor cameras 11 as described above; for example, it can be transferred directly to the base station K from each of the sensor cameras 11. Furthermore, each of the sensor cameras 11 can rearrange and transfer the growth situation information of other sensor cameras 11, or can sequentially transfer each piece of growth situation information to the base station K in a predetermined order. As an example, when each of the sensor cameras 11 transfers its growth situation information directly to the base station K, the information can be transferred from each of the sensor cameras 11 to the base station K, and this can be done more efficiently.
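A sketch of the relay step performed by an intermediate node such as N3 or N6 in figure 8: the node merges the bundles received from upstream nodes with its own growth situation information, deduplicating by camera IP address, and forwards the result one hop toward the base station K. `next_hop.send` stands in for a near-field transmission primitive and is hypothetical.

```python
def relay_growth_info(own_info: dict, received_bundles, next_hop):
    """Merge upstream growth situation information with this node's own
    and forward the bundle one hop toward the base station K."""
    merged = {}
    for bundle in received_bundles:
        for info in bundle:
            merged[info["ip"]] = info   # deduplicate by camera IP address
    merged[own_info["ip"]] = own_info   # add this node's own information
    next_hop.send(list(merged.values()))
```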
[00103] Furthermore, as illustrated in figure 9, when the sensor cameras 11 indicated by nodes N11 through N17, N21 through N23 and N31 through N33 are installed, a configuration can be adopted in which, with respect to the sensor cameras 11 indicated, for example, by nodes N21 through N23 and N31 through N33, the sensor cameras 11 indicated by nodes N21 through N23 are defined as a first group G11, the sensor cameras 11 indicated by nodes N31 through N33 are defined as a second group G12, the pieces of growth situation information are collected at a representative node of each group, and the sensor camera 11 of the representative node rearranges and transmits the pieces of growth situation information of the sensor cameras 11 of the other nodes that belong to the group. Additionally, regarding the definition of the groups G11 and G12, for example, sensor cameras 11 present on cropland owned by the same owner can be defined as belonging to the same group, or sensor cameras 11 paired in order to capture a stereoscopic image, as described herein, can be defined as belonging to the same group. [00104] Through the process described above, pieces of growth situation information that include RGB images and NDVI images can be generated at a predetermined time interval, sequentially transmitted to the server 14 and sequentially accumulated in the server 14. <Sensing process> [00105] With reference to figure 10, an illustrative sensing process will be described. [00106] In Step S31, the sensor 31 captures an image at a size at which the size and color of the agricultural produce to be harvested can be sufficiently recognized within the cultivation range of the produce that is the object. Furthermore, the sensor cameras 11 are installed at a range and in a direction in which the imaging of the cropland under the imaging conditions described above can be carried out. [00107] In Step S32, the RGB image generation unit 32 and the NDVI image generation unit 33 perform a demosaicing process on the light beams of each color of the pixels captured by the sensor 31. In other words, the RGB image generation unit 32 performs the demosaicing process on the respective red, green and blue light pixels to generate red, green and blue component signal images. Furthermore, the NDVI image generation unit 33 performs a demosaicing process on the NIR pixels to generate an NIR component signal image. [00108] In Step S33, the RGB image generation unit 32 combines the demosaiced RGB component signal images to generate an RGB image. [00109] In Step S34, the NDVI image generation unit 33 measures the intensity of the NIR and red light serving as incident light in a region recognized as an image of the sky for each pixel, measures the intensity of the NIR and red light serving as reflected light in regions other than the aforementioned region, based on the NIR image and the red image, computes the reflectances of NIR and red light, and generates an NDVI image. For this reason, the sensor 31 is installed at an angle at which both the imaged region of the agricultural production that is the object and a region from which incident red light or NIR from the sky can be measured are included. Furthermore, when it is difficult to install the sensor at such an angle, a pan and tilt mechanism is provided on the sensor cameras 11; the incident red light and NIR are captured with the cameras facing the sky, the cameras are then controlled in such a way that the regions of the agricultural production that constitute the object are imaged, the reflected light is captured, and the NDVI image described above is generated.
Furthermore, the reflectances of NIR and red light can also be obtained by measuring the intensity of incident light relative to a diffuser panel that has a known reflectance, by calculating a reflection coefficient from the ratio of the reflection intensity to the luminance of a target, and by converting the coefficient into a reflectance. [00110] In Step S35, the control unit 34 controls the environment information measuring unit 37 in such a way that the temperature, humidity and atmospheric pressure that constitute the environment information are measured. [00111] With the process described above, the information that constitutes the growth situation information, such as the RGB image, the NDVI image, and the temperature, humidity and atmospheric pressure included in the measured environment information, is generated. Note that the information constituting the growth situation information may include information other than the RGB image, the NDVI image, and the temperature, humidity and atmospheric pressure included in the environment information. Such information may include any information that is needed to recognize a growth situation. <Processes of accumulating growth situation information by the server and by each terminal device> [00112] With reference to figures 11 and 12, the processes of accumulating growth situation information by the server 14 and by each terminal device 12 will be described. [00113] In Step S61, the control unit 81 of the server 14 controls the communication unit 92 in such a way that it is determined whether or not growth situation information has been transmitted from some sensor camera 11, and when the information has not been transmitted, the process proceeds to Step S82. [00114] In Step S82, the control unit 81 determines whether or not an operation unit not shown in the drawings has been operated and an end of operation has been instructed; when an end of operation is instructed, the process ends. Furthermore, when an end of operation is not instructed, the process returns to Step S61. In other words, when an end of operation is not instructed and growth situation information is not transmitted, the processes of Steps S61 and S82 are repeated. When growth situation information is transmitted in Step S61, for example by the process of Step S14 of figure 7, the process proceeds to Step S62. [00115] In Step S62, the control unit 81 controls the communication unit 92 in such a way that the growth situation information transmitted from the sensor cameras 11 is received, and controls the growth situation information accumulation unit 82 in such a way that the received growth situation information is accumulated. In this illustrative case, the received growth situation information may comprise a plurality of pieces of growth situation information from the plurality of sensor cameras 11, as described herein. Thus, the plurality of pieces of growth situation information can be accumulated through a single process. However, in the following description, the process as a whole proceeds on the assumption that pieces of growth situation information are transmitted, in one reception process, from two sensor cameras 11 that capture the same target as a stereoscopic image, although other processes are also encompassed by various embodiments of the present description.
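A minimal sketch of the reflectance estimation of Step S34, assuming boolean masks for the sky region (incident light) and the target region (reflected light); using the mean sky intensity as the incident level for every target pixel is a simplification. Feeding the red and NIR reflectances obtained this way into formula (1) yields the NDVI image.

```python
import numpy as np

def band_reflectance(band: np.ndarray, sky_mask: np.ndarray,
                     target_mask: np.ndarray) -> np.ndarray:
    """Reflectance of one band (red or NIR) as the ratio of reflected
    intensity (target region) to incident intensity (sky region)."""
    incident = float(band[sky_mask].mean())
    out = np.zeros(band.shape, dtype=np.float32)
    out[target_mask] = band[target_mask] / max(incident, 1e-6)
    return out
```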
[00116] In Step S63, the control unit 81 controls the target region specification unit 83 such that a target region, which is a region in an image in which the target agricultural produce is imaged, is specified based on an RGB image included in the transmitted growth situation information. As an illustrative specific example, the target region specification unit 83 extracts feature information, such as the shape and hue of the agricultural produce for which a growth index is to be computed, from the RGB image. Furthermore, the target region specification unit 83 determines whether or not the extracted feature information corresponds to the actual shape and hue of the agricultural produce stored in advance, and specifies as the target region the corresponding region in the RGB image in which the agricultural produce whose growth index is to be computed is imaged. Note that, in this illustrative case, in relation to the agricultural produce to be specified, the target region specification unit 83 can, for example, specify the feature information by searching the management information accumulated in the management information accumulation unit 88 based on the IP addresses included in the growth situation information that includes the RGB image, and by reading and using the information recorded in the field of the type of agricultural production, as shown in figure 6. [00117] In Step S64, the control unit 81 controls the RGB image growth index computing unit 84 such that an RGB image growth index is computed based on the target region in the RGB image. As an illustrative specific example, in relation to a rice harvest time, the RGB image growth index computing unit 84 considers the time at which the ratio of green in the rice husks is around 10% as a harvest start time, and the time at which that ratio is about 2% as a harvest time threshold; a growth index is thus computed based on the green ratios of the rice husks, and the index is defined as the RGB image growth index. [00118] In Step S65, the control unit 81 controls the NDVI image growth index computing unit 85 such that an NDVI image growth index is computed based on the target region in an NDVI image. As an illustrative specific example, the NDVI image growth index computing unit 85 computes, for example, the average value, variance, or high-order variance of the NDVI of the target region, thereby computing the NDVI image growth index. [00119] In Step S66, the control unit 81 controls the stereoscopic image growth index computing unit 86 such that a stereoscopic image growth index is computed based on a stereoscopic image. As a specific illustrative example, the stereoscopic image growth index computing unit 86 extracts two RGB images included in the growth situation information from at least two sensor cameras 11 that capture RGB images constituting the stereoscopic image. In other words, as illustrated in figure 13, sensor cameras 11-1 and 11-2 image the same agricultural produce M1 from different angles, and the stereoscopic image growth index computing unit 86 generates a stereoscopic image, that is, a parallax image, from the two RGB images captured by the two sensor cameras 11-1 and 11-2. Furthermore, the stereoscopic image growth index computing unit 86 generates a three-dimensional image of the agricultural produce present in the target region based on the parallax image, and computes a stereoscopic image growth index from the size of the produce. Illustrative sketches of the index computations in Steps S64 and S65 follow.
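As illustrative sketches of Steps S64 and S65, the following Python fragments compute a green-ratio growth index and NDVI statistics for a target region. The 10% and 2% thresholds follow the text above, while the green-dominance test and the array layout are simplifying assumptions standing in for the hue analysis of the embodiment.

    import numpy as np

    def rgb_image_growth_index(rgb_region, start_ratio=0.10, end_ratio=0.02):
        # Fraction of pixels in which green dominates red and blue; this
        # crude test stands in for the hue analysis of Step S64.
        r, g, b = rgb_region[..., 0], rgb_region[..., 1], rgb_region[..., 2]
        green_ratio = ((g > r) & (g > b)).mean()
        # Map the green ratio onto [0, 1]: ~10% green marks the harvest
        # start time; the index reaches 1.0 at the ~2% harvest threshold.
        index = (start_ratio - green_ratio) / (start_ratio - end_ratio)
        return float(np.clip(index, 0.0, 1.0))

    def ndvi_image_growth_index(ndvi_region):
        # Step S65: statistics of the NDVI values inside the target region.
        return float(ndvi_region.mean()), float(ndvi_region.var())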
Note that the target region in the processes of Steps S65 and S66 can be a region different from the region obtained from the RGB images, provided that a produce region that is a target can be specified from it; for example, a region that belongs to either a region of the NDVI image that has a high probability of containing a target and that has an NDVI value higher than a predetermined value, or a region obtained from the RGB images, can be defined as the target region. [00120] In Step S67, the control unit 81 controls the growth index computing unit 89 such that a growth index of the target agricultural produce is computed based on the RGB image growth index, the NDVI image growth index, and the stereoscopic image growth index. As a specific illustrative example, the growth index computing unit 89 may consider the average of the three types of growth indices as the growth index, may consider the weighted sum of the indices as the growth index, or may select one of the indices as the growth index. Furthermore, when it is not possible to compute all of the RGB image growth index, the NDVI image growth index, and the stereoscopic image growth index, the mean value or weighted sum of the computable growth indices can be defined as the growth index; a sketch of this combination is given at the end of this passage. [00121] In Step S68, the control unit 81 searches the management information accumulated in the management information accumulation unit 88 for the management information that corresponds to the IP address included in the growth situation information transmitted from the sensor cameras 11, and updates the growth index included in the fetched management information to the value computed through the above-described process. [00122] In Step S69, the control unit 81 controls the harvest time computing unit 87 such that a harvest time is computed based on the growth index, the environment information, the growth situation information, and information on past harvest times. In other words, the harvest time computing unit 87 computes the predicted harvest time of this season from information on the change in the growth evaluation index for this season, based on the relationship between information on changes in past growth evaluation indices and information on past harvest times, and defines it as the appropriate predicted harvest time. [00123] In Step S70, the control unit 81 searches the management information accumulated in the management information accumulation unit 88 for the management information that corresponds to the IP address included in the growth situation information transmitted from the sensor cameras 11, and updates the appropriate predicted harvest time information included in the fetched management information to the value computed through the above-described process. [00124] In Step S71, the control unit 81 controls the sensor camera operation situation monitoring unit 91 in such a way that whether or not there is an abnormality occurring in the sensor cameras 11 that transmit the growth situation information is determined based on the RGB images. As a specific illustrative example, the sensor camera operation situation monitoring unit 91 compares the current RGB image with the RGB image captured at the previous timing, both having the same IP address included in the growth situation information transmitted, among the growth situation information accumulated in the growth situation information accumulation unit 82, and determines whether or not there is an abnormality occurring in the sensor cameras 11 based on whether or not the change between the images is greater than a predetermined value. In other words, the sensor cameras 11 are basically fixed-point cameras, and there will not be a significant change between RGB images even if the predetermined imaging interval is, for example, about a day. Thus, if there is a significant change, it is considered that a problem has arisen in the sensor cameras 11. Therefore, when the sensor camera operation situation monitoring unit 91 compares the currently transmitted RGB image with the previous RGB image and considers that there is an abnormality occurring in the cameras due to a significant change between the images, the process proceeds to Step S72. Note that the occurrence of an abnormality in the sensor cameras 11 can also be determined from a comparison of NIV images, NDVI images, mean NDVI values, variances, high-order variances, and growth indices. [00125] In Step S72, the sensor camera operation situation monitoring unit 91 considers that there is an abnormality occurring in the sensor cameras 11 that transmit the growth situation information, fetches management information based on the IP address of the sensor cameras 11, and notifies a terminal device 12 managed and operated by the owner of the agricultural production (cultivated land) included in the fetched management information, or a cell phone not shown in the drawings, of the event. [00126] On the other hand, when it is considered that no abnormality occurs in Step S71, the process of Step S72 is skipped. [00127] In Step S73, the control unit 81 determines whether or not the growth index is higher than a predetermined threshold value and the appropriate predicted harvest time is approaching. In Step S73, when, for example, the growth index is higher than the predetermined threshold value and the appropriate predicted harvest time is approaching, in other words, when the day corresponding to the appropriate predicted harvest time, or a day a predetermined number of days before that day, is coming, the process proceeds to Step S74. [00128] In Step S74, the control unit 81 controls the harvest plan creation unit 93 such that a harvest plan is created. For example, the harvest plan creation unit 93 estimates the number of crops to be harvested from the ranges in which appropriate predicted harvest times overlap in the management information managed according to IP addresses, and makes a harvest schedule of how to proceed with the harvesting process from a harvest start day based on the processing performance of the agricultural machinery for harvesting, which can be registered in advance by the owner of the agricultural production (cultivated land). [00129] In Step S75, the control unit 81 controls the communication unit 92 in such a way that the harvest plan information created by the harvest plan creation unit 93 is transmitted to the terminal device 12 managed and operated by the farmer. Note that the harvest plan information can also be transmitted to the terminal devices 12 managed and operated by the distributor, dealer, and consumer. With this operation, the distributor, dealer, and consumer can make their own distribution plans, sales plans, and acquisition plans based on the information in the harvest plan.
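A minimal sketch of the combination in Step S67 and the threshold determination in Step S73 follows. The weights and the threshold value are assumptions for illustration; indices that could not be computed are passed as None and skipped, so the result degrades to a weighted mean of the computable indices, as described above.

    def combined_growth_index(indices, weights):
        # indices: {"rgb": ..., "ndvi": ..., "stereo": ...}; None entries
        # mark indices that could not be computed and are skipped.
        usable = {k: v for k, v in indices.items() if v is not None}
        if not usable:
            raise ValueError("no computable growth index")
        total_weight = sum(weights[k] for k in usable)
        return sum(weights[k] * v for k, v in usable.items()) / total_weight

    # Step S73: compare the combined index with a predetermined threshold.
    GROWTH_THRESHOLD = 0.9  # assumed value
    index = combined_growth_index(
        {"rgb": 0.92, "ndvi": 0.88, "stereo": None},  # stereo not computable
        weights={"rgb": 1.0, "ndvi": 2.0, "stereo": 1.0})
    harvest_time_near = index > GROWTH_THRESHOLD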
[00130] After this process, in Step S91 (of figure 12), the control unit 61 of each terminal device 12 controls the communication unit 64 in such a way that whether or not the harvest plan has been transmitted is determined, and the same process is repeated until the plan is transmitted. When the harvest plan is transmitted through, for example, the process of Step S75 of figure 11, in Step S91, the process proceeds to Step S92. [00131] In Step S92, the control unit 61 controls the communication unit 64 in such a way that the transmitted harvest plan information is received. [00132] In Step S93, the control unit 61 causes the harvest plan information received by the communication unit 64 to be displayed on the display unit 66. [00133] In Step S76 (of figure 11), the control unit 81 controls the distribution plan creation unit 94 in such a way that a distribution plan is created. For example, the distribution plan creation unit 94 estimates the number of crops to be harvested from the ranges in which appropriate predicted harvest times overlap in the management information managed according to IP addresses, and makes a distribution schedule of how to proceed with the distribution process from the harvest start day based on the transport performance of the distribution vehicles, which can be registered in advance by the contract distributor. [00134] In Step S77, the control unit 81 controls the communication unit 92 in such a way that the distribution plan information generated by the distribution plan creation unit 94 is transmitted to the terminal device 12 managed and operated by the contract distributor. Note that since the process performed at the terminal device 12 merely receives and displays the distribution plan, rather than the harvest plan, in the process described with respect to the flowchart of figure 12, description thereof is omitted. Furthermore, in the processes of Steps S76 and S77, a distribution plan, a sales plan, and an acquisition plan for all cultivated land, including cultivated land for which contracts are not made with the distributor, dealer, and consumer, can be transmitted to the distributor, dealer, and consumer who are shown in figure 1 as contracting a series of services. In addition, in the processes of Steps S76 and S77, a distribution plan, a sales plan, and an acquisition plan for cultivated land in regions that are within the business scopes of the distributor, dealer, and consumer can be transmitted to the distributor, dealer, and consumer who are shown in figure 1 as contracting a series of services. In an illustrative case like this, for a large distribution company, for example, the distribution plan can be transmitted to its branches. [00135] In Step S78, the control unit 81 controls the sales plan creation unit 95 such that a sales plan is created. As an illustrative specific example, the sales plan creation unit 95 estimates the number of crops to be harvested from the ranges in which appropriate predicted harvest times overlap in the management information managed according to IP addresses, and makes a sales schedule of how to proceed with sales from the harvest start day based on a sellable quantity of produce in a warehouse, which is registered in advance by the contract dealer. [00136] In Step S79, the control unit 81 controls the communication unit 92 in such a way that the sales plan information created by the sales plan creation unit 95 is transmitted to the terminal device 12 managed and operated by the contract dealer. A scheduling sketch common to these plan creation steps is given below.
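The plan creation steps S74, S76, and S78 share the same scheduling pattern: estimate the quantities whose predicted harvest times overlap and spread them over days according to a registered daily capacity (machinery throughput, transport performance, or sellable quantity). The following is a sketch under assumed data layouts; the tuple format and the spill-over policy are illustrative choices, not part of the embodiment.

    from datetime import date, timedelta

    def make_plan(fields, daily_capacity):
        # fields: list of (identifier, predicted_harvest_day, quantity)
        # tuples drawn from the management information; daily_capacity is
        # the capacity registered in advance (quantity per day). Quantities
        # that exceed one day's capacity spill over to the following days.
        plan, day, used = [], None, 0.0
        for field_id, harvest_day, quantity in sorted(fields, key=lambda f: f[1]):
            if day is None or harvest_day > day:
                day, used = harvest_day, 0.0
            while quantity > 0:
                portion = min(quantity, daily_capacity - used)
                if portion > 0:
                    plan.append((day, field_id, portion))
                    quantity -= portion
                    used += portion
                if used >= daily_capacity:
                    day, used = day + timedelta(days=1), 0.0
        return plan

    schedule = make_plan(
        [("field-A", date(2014, 9, 10), 30.0), ("field-B", date(2014, 9, 10), 50.0)],
        daily_capacity=40.0)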
Note that since the process performed on the terminal device 12 merely receives and displays the sales plan, rather than the harvest plan, in the process described in relation to the flowchart of figure 12, description thereof is omitted. Furthermore, in the processes of Steps S78 and S79, a configuration can be made whereby a dealer located near a cultivated land is selected from the dealers shown in figure 1 who serve as contractors for a series of services for each cultivated land, and the sales plan is transmitted to the selected dealer. In an illustrative case like this, for example, the sales plan can be transmitted to branches of a large retailer, such as a supermarket, and additionally, information to which a distribution plan is added can be transmitted to them. [00137] In Step S80, the control unit 81 controls the acquisition plan creation unit 96 such that an acquisition plan is created. As an illustrative specific example, the acquisition plan creation unit 96 estimates the number of crops to be harvested from the ranges in which appropriate predicted harvest times overlap in the management information managed according to IP addresses, and makes an acquisition schedule of how to proceed with acquisition from the harvest start day based on a desired quantity of produce to purchase, which is registered in advance by the contract consumer. [00138] In Step S81, the control unit 81 controls the communication unit 92 such that the acquisition plan information generated by the acquisition plan creation unit 96 is transmitted to the terminal device 12, and the process returns to Step S61. Note that since the process performed on the terminal device 12 merely receives and displays the acquisition plan, rather than the harvest plan, in the process described in relation to the flowchart of figure 12, description thereof is omitted. Furthermore, in the processes of Steps S80 and S81, the acquisition plan for the cultivated land of the produce to be acquired can be transmitted to each consumer, among the consumers shown in figure 1, who are contractors of a series of services. Furthermore, the acquisition plan can be generated according to the sales plan of a specific dealer, for example, one located close to a consumer's location, and transmitted to the consumers who purchase produce from that dealer. [00139] Furthermore, in Step S73, when the growth index is not higher than the predetermined threshold value, the process proceeds to Steps S83 to S86. Note that since the processes of Steps S83 to S86 are the same as the processes of Steps S74, S76, S78, and S80, description thereof is omitted. In other words, even when the growth index is not higher than the predetermined threshold value in Step S73, a configuration can be made in which the harvest plan, distribution plan, sales plan, and acquisition plan are created; when there is an investigation in relation to each of the plans, a response can be made for each of the plans, and developments in the growth index can be transmitted whenever they occur. [00140] In other words, through the above-described process, parts of the growth situation information that are transmitted from the sensor cameras 11 at a predetermined time interval are sequentially accumulated. During this time, the growth index and the appropriate predicted harvest time in the management information managed based on the IP addresses of the sensor cameras 11 are sequentially updated and recorded based on the parts of the growth situation information.
As a result, the growth index and the appropriate predicted harvest time are updated to the latest information at a predetermined time interval, and when the appropriate predicted harvest time is approaching, the updated information can be transferred to the terminal devices 12 managed and operated by each of the farmer, the contract distributor, the contract dealer, and the contract consumer in real time as harvest plan, distribution plan, sales plan, and acquisition plan information. Furthermore, since appropriate actions can be taken in accordance with the appropriate predicted harvest time by transferring the harvest plan, distribution plan, sales plan, and acquisition plan information, efficient harvesting, distribution, sales, and acquisition are possible. Furthermore, by comparing the RGB images in a time-series manner and thereby detecting an abnormality in the sensor cameras 11, whether or not the sensor cameras 11 are properly installed can be monitored. Furthermore, since the sensor cameras 11 can be used as a communication path, even when a sensor camera 11 is stolen or moved to another position by a storm, or when crops are stolen, an abnormality can be detected by comparing the transmitted positional information with the positional information included in the management information. Furthermore, when the sensor cameras 11 can be used as a communication path, even when a direction or an imaging angle of the sensor cameras 11 changes for any cause, an abnormality can be considered to be occurring and detected through the same process. Note that the harvest plan, distribution plan, sales plan, and acquisition plan can be transferred on the day specified as the appropriate predicted harvest time, or on a day a predetermined number of days before the specified day. <Investigation response process> [00141] With reference to figure 14, an investigation response process will be described. Note that here, the processes of transmitting investigation information for making an investigation of a harvest time to the server 14 by the terminal device 12 managed and operated by the farmer, and of receiving and displaying the response information, will be described. Note that since the same process is performed when the contract dealer, contract consumer, and contract distributor make the same investigation, description thereof is omitted. [00142] In other words, in Step S101, the control unit 61 of the terminal device 12 controls the operation unit 63 in such a way that whether or not an investigation operation is performed in accordance with an operation by a user is determined, and the same process is repeated until it is considered that there is an investigation operation. When it is considered that there is an investigation operation in Step S101, the process proceeds to Step S102. [00143] In Step S102, the investigation unit 62 generates investigation information for making an investigation of a harvest time in association with the IP addresses identifying the sensor cameras 11 that monitor the crops grown on the contract farmer's cultivated land, which are stored in the IP address storage unit 65.
Note that when the distributor, dealer, and consumer shown in figure 1 who receive a series of services conclude a contract for one service rather than for each cultivated land (each IP address), the investigation unit 62 stores in advance a relationship table in which the business scopes of the distributor, dealer, and consumer are associated with the IP addresses corresponding to the sensor cameras 11, and generates investigation information that includes the IP addresses according to the business scopes of the distributor, dealer, and consumer. [00144] In Step S103, the control unit 61 controls the communication unit 64 in such a way that the investigation information generated by the investigation unit 62 for making a harvest time investigation is transmitted to the server 14 via the network 13. [00145] In Step S121, the control unit 81 of the server 14 controls the investigation reception unit 97 in such a way that whether or not the investigation information has been transmitted from the communication unit 92 is determined, and the same process is repeated until the information is transmitted. When the investigation information is deemed to be transmitted in Step S121, the process proceeds to Step S122. [00146] In Step S122, the control unit 81 controls the investigation reception unit 97 in such a way that the investigation information transmitted from the communication unit 92 is acquired, and the investigation content is verified. [00147] In Step S123, the control unit 81 fetches the management information accumulated in the management information accumulation unit 88 based on the IP addresses included in the investigation information, and reads the appropriate predicted harvest time information and the region information in the fetched management information. Here, when there are a plurality of sensor cameras 11 monitoring the crops grown by the farmer, a plurality of IP addresses are included. Furthermore, the appropriate predicted harvest time information fetched here is based not only on a sensor camera 11 installed on the farmer's own cultivated land, but also on an IP address that specifies a sensor camera 11 designated by a user. [00148] In Step S124, the control unit 81 controls the mapping unit 90 in such a way that the appropriate predicted harvest time information is mapped to the positions corresponding to the region information read according to a schedule, and thereby appropriate predicted harvest time mapping information is generated. [00149] In Step S125, the control unit 81 controls the response creation unit 98 in such a way that response information including the generated appropriate predicted harvest time mapping information is created. [00150] In Step S126, the control unit 81 controls the communication unit 92 in such a way that the response information including the created appropriate predicted harvest time mapping information is transmitted to the terminal device 12 that transmitted the investigation information. [00151] In Step S104, the control unit 61 of the terminal device 12 controls the communication unit 64 in such a way that the response information is received. [00152] In Step S105, the control unit 61 of the terminal device 12 causes the response information including the received appropriate predicted harvest time mapping information to be displayed on the display unit 66. [00153] Through the processes described above, the mapped appropriate predicted harvest time information can be acquired; a sketch of the lookup and mapping of Steps S123 and S124 is given below.
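A sketch of the lookup and mapping of Steps S123 and S124 follows, assuming the management information is held as a dictionary keyed by sensor camera IP address and that each record carries its region (position) and appropriate predicted harvest time; the record layout is an assumption for illustration only.

    def build_harvest_time_mapping(management_info, ip_addresses):
        # management_info: {ip: {"region": ..., "predicted_harvest_time": ...}}
        mapping = []
        for ip in ip_addresses:
            record = management_info.get(ip)
            if record is None:
                continue  # no camera is registered under this IP address
            mapping.append({"position": record["region"],
                            "harvest_time": record["predicted_harvest_time"]})
        return mapping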
Furthermore, with the mapping information displayed, the appropriate predicted harvest time information can be transferred in real time on demand. Note that, in the above, the investigation response process performed by the terminal device 12 managed by the farmer has been described, but the same process can also be performed on the terminal devices 12 managed by the contract distributor, the contract dealer, and the contract consumer. Furthermore, in the foregoing, the investigation content is the appropriate predicted harvest time information, but even if the content is other information, for example, the harvest plan, the distribution plan, the sales plan, or the acquisition plan managed by the server 14, an investigation can be made, and a response to the investigation can be made, through the same process. Furthermore, it is possible, for example, that mapping information of the appropriate predicted harvest times of the crops is transmitted to the terminal devices 12 managed by the farmer, the distributor, and the consumer, or that dealers are classified for each agricultural production and mapping information of the appropriate predicted harvest times is transmitted to the dealers. In other words, information on the appropriate predicted harvest times, for example, the name of the agricultural production, the time to harvest, and the name of the branch to which it is distributed, can be transmitted to a large supermarket. [00154] Note that, in the above, the example in which RGB images and an NIV image are captured by the sensor cameras 11, growth situation information that includes the images is transmitted to the server 14, and an RGB image growth index, an NDVI image growth index, and an appropriate predicted harvest time are computed by the server 14 has been described. However, by providing the same functions as the server 14 in the sensor cameras 11, a configuration can be made whereby, in the sensor cameras 11, a region in which the target agricultural produce is imaged is specified, an RGB image growth index and an NDVI image growth index of the region specified as the region of the RGB images and the NDVI image in which the agricultural produce is imaged are computed, growth situation information that includes the indices is generated, and the information is supplied to the server 14. Furthermore, in addition to these functions, the sensor cameras 11 can capture a stereoscopic image in cooperation with another sensor camera 11 in the vicinity, and compute a stereoscopic image growth index. Furthermore, the sensor cameras 11 can compute a growth index and an appropriate predicted harvest time based on the RGB image growth index, the NDVI image growth index, and the stereoscopic image growth index obtained as described above. In this illustrative case, when the appropriate predicted harvest time is computed, the sensor cameras 11 can read past growth situation information accumulated in the server 14, and compute the appropriate predicted harvest time also using the past growth situation information. [00155] Furthermore, in the foregoing, the configuration example in which the sensor cameras 11, the terminal devices 12, and the server 14 are included has been described for configuring the information processing system; however, a cloud computer can be used instead of the server 14.
(2. First modification example) [00156] In the above, the example in which the information captured by the sensor cameras 11 is demosaiced to generate the RGB and NIV component signal images was described; however, as illustrated in figure 15, for example, using images P112 and P114 that include the red light signals and NIV signals before being demosaiced, an NDVI image, as shown by image P132, can be generated. Since the demosaicing process can be omitted, or the number of pixels to be dealt with can be reduced by this operation, the processing load can be decreased and the processing speed can be increased; a sketch of this variant is given after the modification examples below. Note that since images P111, P113, P121 to P123, and P131 are the same as images P11, P13, P21 to P23, and P31 of figure 3, description thereof will be omitted. (3. Second modification example) [00157] Furthermore, in the above, the example in which the pixels of the RGB and NIV component signals are arranged in the direction of a plane of the sensor cameras 11 was described; however, as illustrated in figure 16, for example, the sensor 31 can be configured by stacking sensor layers in the light traveling direction to image the component signals. In other words, in figure 16, a blue light sensor layer L1, a green light sensor layer L2, a red light sensor layer L3, and an NIV sensor layer L4 are configured to be stacked from the top of the drawing, as shown by image P151. Each layer has a sensor structure in which only the color component of a target wavelength is detected. As a result, images P161 to P164 of the respective layers, which include the green light, red light, blue light, and NIV component signal images, are generated. As a result, an RGB image P171 is generated from images P161 to P163, and an NDVI image P172 is generated from images P162 and P164. (4. Third modification example) [00158] Furthermore, in relation to the sensor 31 that detects the RGB component signals, a configuration can be made in which, for example, an IR cut filter F that includes a dielectric laminated film, for example, a laminated film made of SiO or SiN, as illustrated in the left part of figure 17, is provided under the RGB color filters FR, FG, and FB, as illustrated in the right part of figure 17, so that the sensor that detects the RGB signal components does not detect NIV; the IR cut filter F is not provided only under the black filter (visible light cut) FA of an NIV sensor. Note that the right part of figure 17 is a two-pixel x two-pixel perspective appearance diagram of the sensor 31, and the left part of the drawing is an enlarged cross-sectional diagram of the IR cut filter F, which shows that infrared light IR is blocked by the IR cut filter and only light T other than infrared light IR is transmitted to the sensor. Note that the black filter FA can be configured not to include a color filter. (5. Fourth modification example) [00159] Furthermore, the sensor 31 that detects the NIV component signals can be configured such that IR cut filters F are provided under the RGB color filters FC and above the sensors SC, as illustrated, for example, in figure 18, so that the sensor that detects the RGB signal components does not detect NIV, and the IR cut filters F are not provided above only the NIV sensors SC. Note that figure 18 is a cross-sectional diagram of four pixels of the sensor 31, showing a configuration of a pixel P1 for NIV, a pixel P2 for light other than NIV, a pixel P3 for NIV, and a pixel P4 for light other than NIV, arranged from the left of the drawing.
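As a sketch of the first modification example above, NDVI values can be computed directly from the red and NIV samples of the un-demosaiced frame. The boolean masks describing which mosaic positions hold red and NIV pixels depend on the sensor's color filter array and are assumptions here, as is the positional pairing of samples.

    import numpy as np

    def ndvi_from_mosaic(raw_frame, red_mask, nir_mask):
        # Extract only the red and NIV samples from the raw mosaic; since
        # demosaicing is skipped, only a fraction of the pixel count of a
        # full-resolution image is processed.
        red = raw_frame[red_mask].astype(float)
        nir = raw_frame[nir_mask].astype(float)
        # Pair the samples positionally; a real color filter array layout
        # would dictate the exact pairing of neighboring red/NIV sites.
        n = min(red.size, nir.size)
        red, nir = red[:n], nir[:n]
        return (nir - red) / np.maximum(nir + red, np.finfo(float).eps)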
[00160] Note that, in the above, the example in which an NDVI image is generated based on the RGB signal components and the NIV signal components, and an NDVI image growth index obtained from the generated NDVI image is used, has been described; however, other growth indices can be used, as long as the growth indices are obtained based on the RGB signal components and the NIV signal components. Thus, a configuration can be made in which, instead of the NDVI image, for example, a Simple Ratio (SR) image, a Global Environment Monitoring Index (GEMI) image, a Soil Adjusted Vegetation Index (SAVI) image, an Enhanced Vegetation Index (EVI) image, a Perpendicular Vegetation Index (PVI) image, a Photochemical Reflectance Index (PRI) image, a Structure Insensitive Pigment Index (SIPI) image, a Plant Sensing Reflectance Index (PSRI) image, a Chlorophyll Index (CI) image, a Modified Simple Ratio (mSR) image, a Modified Normalized Difference (mND) image, a Cup Chlorophyll Index (CCI) image, a Water Index (WI) image, a Normalized Difference Water Index (NDWI) image, a Cellulose Absorption Index (CAI) image, a Ratio Vegetation Index (RVI) image, a Vegetation Index Type (KVI) image, or a Difference Vegetation Index (DVI) image, which are obtained based on the RGB signal components and the NIV signal components, is used, and the growth indices corresponding to those images are computed and used. Furthermore, by combining a plurality of image types obtained based on the RGB signal components and the NIV signal components, an image growth index of the combined image can be obtained and used. [00161] The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed from a recording medium onto a computer embedded in dedicated hardware or, for example, onto a general-purpose personal computer that can execute various functions by installing various programs. [00162] Figure 19 illustrates a configuration example of a general-purpose personal computer. The personal computer has a Central Processing Unit (CPU) 1001 built into it. The CPU 1001 is connected to an input and output interface 1005 via a bus 1004. A Read Only Memory (ROM) 1002 and a Random Access Memory (RAM) 1003 are connected to the bus 1004. [00163] An input unit 1006 that is configured by an input device, such as a keyboard or a mouse, with which a user inputs operation commands, an output unit 1007 that outputs process operation screens or process result images to a display device, a storage unit 1008 that is configured by a hard disk drive on which programs and various types of data are stored, and a communication unit 1009 that is configured by a Local Area Network (LAN) adapter and performs communication processes over a network represented by the Internet are connected to the input and output interface 1005. Furthermore, a disk drive 1010 that reads and writes data to removable media 1011, such as a magnetic disk (including a flexible disk), an optical disc (including a Compact Disc Read Only Memory (CD-ROM) or a Digital Versatile Disc (DVD)), a magneto-optical disc (including a Mini Disc (MD)), or a semiconductor memory, is connected thereto.
[00164] The CPU 1001 executes various processes according to programs stored in the ROM 1002 or programs read from the removable media 1011, such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory, installed in the storage unit 1008, and loaded into the RAM 1003 from the storage unit 1008. The RAM 1003 also stores, as appropriate, data necessary for the CPU 1001 to execute the various processes. [00165] A computer configured as described above performs the above-described series of processes when the CPU 1001, for example, causes programs stored in the storage unit 1008 to be loaded into the RAM 1003 and executes the programs via the input and output interface 1005 and the bus 1004. [00166] The programs executed by the computer (the CPU 1001) can be provided by being recorded on the removable media 1011, for example, as packaged media. Furthermore, the programs can be provided via a wired or wireless transmission medium, such as a local area network, the Internet, or digital satellite broadcasting. [00167] The programs can be installed in the storage unit 1008 of the computer via the input and output interface 1005 by loading the removable media 1011 into the disk drive 1010. Furthermore, the programs can be installed in the storage unit 1008 by being received by the communication unit 1009 via a wired or wireless transmission medium. Furthermore, the programs can be installed in the ROM 1002 or the storage unit 1008 in advance. [00168] Note that the programs executed by the computer may be programs by which the processes are carried out in a time-series manner in the order described herein, or programs by which the processes are carried out in parallel or at necessary timings, such as when a call is performed. [00169] Furthermore, in the present specification, a system means a set of a plurality of constituent elements (such as devices or modules (components)), and it does not matter whether or not all the constituent elements are accommodated in the same housing. Therefore, a plurality of devices that are accommodated in individual housings and connected to each other via a network, and a single device in which a plurality of modules are accommodated in one housing, both fall within the definition of a system. [00170] Note that embodiments of the presently described technology are not limited to the embodiments described above, and can be modified in various ways without departing from the essence of the presently described technology. [00171] For example, the presently described technology may adopt a cloud computing configuration in which one function is divided and shared by a plurality of devices over a network. [00172] Furthermore, each step described in the above flowcharts can be executed by one device or in a divided manner by a plurality of devices. [00173] Additionally, when a plurality of processes are included in one step, the plurality of processes included in that step can be executed by one device or in a divided manner by a plurality of devices. [00174] Note that the presently described technology can also adopt the following configurations.
(A-1) A method, comprising: obtaining image information of an organism comprising an optical data set; calculating a growth index based on the optical data set; and calculating a predicted harvest time based on the growth index, wherein the image information comprises at least one of: (a) visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, and (b) a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the image data set from at least two positions. (A-2) The method of (A-1), further comprising transferring the predicted harvest time to an external party. (A-3) The method of (A-2), wherein the external party is at least one of a retailer, a general consumer, a restaurant, and a food producer. (A-4) The method of any one of (A-1) - (A-3), wherein the visible image data is generated based on a demosaiced RGB pixel signal. (A-5) The method of any one of (A-1) - (A-4), wherein the non-visible image data is generated based on a demosaiced NIV signal. (A-6) The method of any one of (A-1) - (A-4), wherein the non-visible image data is generated based on an NIV signal without demosaicing. (A-7) The method of any one of (A-1) - (A-3), wherein the visible image data is generated based on a demosaiced RGB pixel signal, and wherein the near-infrared image data is generated based on a demosaiced NIV signal. (A-8) The method of any one of (A-1) - (A-3), wherein the visible image data is generated based on a demosaiced RGB pixel signal, and wherein the near-infrared image data is generated based on an NIV signal without demosaicing. (A-9) The method of any one of (A-1) - (A-8), wherein the optical data set is obtained using a stacked-type image sensor, wherein the stacked-type image sensor has a blue light sensor layer stacked on a green light sensor layer, wherein the green light sensor layer is stacked on a red light sensor layer, and wherein the red light sensor layer is stacked on a near-infrared ray (NIV) sensor layer. (A-10) The method of any one of (A-1) - (A-8), wherein the optical data set is obtained using an image sensor comprising a set of RGB color filters provided over a laminated film, wherein the laminated film comprises at least one of SiO and SiN, and wherein the set of RGB color filters comprises an FR color filter, an FG color filter, and an FB color filter. (A-11) The method of any one of (A-1) - (A-8), wherein the optical data set is obtained using an image sensor comprising a set of RGB color filters provided over a set of infrared (IR) cut filters, wherein the set of IR cut filters is provided over a set of image sensors. (A-12) The method of any one of (A-1) - (A-11), further comprising: calculating parallax image data based on at least two of the image data from the image data set obtained from the at least two image capture devices; and calculating the growth index based on the parallax image data, wherein the at least two of the image data are captured from at least one of: at least two angles and at least two positions by the at least two image capture devices.
(A-13) A system, comprising: an image capture device, wherein at least one of a server and the image capture device is configured to: obtain image information of an organism comprising an optical data set; calculate a growth index based on the optical data set; and calculate a predicted harvest time based on the growth index, wherein the image information comprises at least one of: (a) visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, and (b) a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the image data set from at least two positions. (A-14) The system of (A-13), further comprising a server, wherein the image capture device is in communication with the server. (A-15) The system of (A-14), wherein the at least one of the server and the image capture device is further configured to transfer the predicted harvest time to an external party. (A-16) The system of any one of (A-13) - (A-15), wherein the visible image data is generated based on a demosaiced RGB pixel signal. (A-17) The system of any one of (A-13) - (A-16), wherein the non-visible image data is generated based on a demosaiced NIV signal. (A-18) The system of any one of (A-13) - (A-16), wherein the non-visible image data is generated based on an NIV signal without demosaicing. (A-19) The system of any one of (A-13) - (A-18), wherein the optical data set is obtained using a stacked-type image sensor, wherein the stacked-type image sensor has a blue light sensor layer stacked on a green light sensor layer, wherein the green light sensor layer is stacked on a red light sensor layer, and wherein the red light sensor layer is stacked on a near-infrared ray sensor layer. (A-20) The system of any one of (A-13) - (A-18), wherein the optical data set is obtained using an image sensor comprising a set of RGB color filters provided over a laminated film, wherein the laminated film comprises at least one of SiO and SiN, and wherein the set of RGB color filters comprises an FR color filter, an FG color filter, and an FB color filter. (A-21) The system of any one of (A-13) - (A-18), wherein the optical data set is obtained using an image sensor comprising a set of RGB color filters provided over a set of infrared (IR) cut filters, wherein the set of IR cut filters is provided over a set of image sensors. (A-22) The system of any one of (A-13) - (A-21), wherein the at least one of the server and the image capture device is further configured to: calculate parallax image data based on at least two of the image data from the image data set obtained from the at least two image capture devices; and calculate the growth index based on the parallax image data, wherein the at least two of the image data are captured from at least one of: at least two angles and at least two positions by the at least two image capture devices.
(A-23) A non-transitory, tangible computer-readable media having stored thereon instructions that cause a processor to perform a method, the method comprising: obtaining image information of an organism comprising an optical data set; calculating a growth index based on the optical data set; and calculating a predicted harvest time based on the growth index, wherein the image information comprises at least one of: (a) visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, and (b) a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the image data set from at least two positions. (A-24) The computer-readable media of (A-23), wherein the method further comprises transferring the predicted harvest time to an external party. (A-25) The computer-readable media of (A-24), wherein the external party is at least one of a retailer, a general consumer, a restaurant, and a food producer. (A-26) The computer-readable media of any one of (A-23) - (A-25), wherein the visible image data is generated based on a demosaiced RGB pixel signal. (A-27) The computer-readable media of any one of (A-23) - (A-26), wherein the non-visible image data is generated based on a demosaiced NIV signal. (A-28) The computer-readable media of any one of (A-23) - (A-27), wherein the method further comprises: calculating parallax image data based on at least two of the image data from the image data set obtained from the at least two image capture devices; and calculating the growth index based on the parallax image data, wherein the at least two of the image data are captured from at least one of: at least two angles and at least two positions by the at least two image capture devices. [00175] Furthermore, please note that the presently described technology may also adopt the following configurations. (B-1) An information processing system that includes an imaging unit that captures an image of agricultural produce as an RGB image and a near-infrared ray (NIV) image, a specification unit that specifies a region of the image in which an object, which is the agricultural produce, is imaged, and a growth index computing unit that computes a growth index of the agricultural produce based on a growth index image obtained from the RGB image, the NIV image, and a red image of the RGB image of the region in the image that is specified by the specification unit and in which the object is imaged. (B-2) The information processing system described in (B-1) above, wherein the growth index image is any one of, or a combination of, a Normalized Difference Vegetation Index (NDVI) image, a Simple Ratio (SR) image, a Global Environment Monitoring Index (GEMI) image, a Soil Adjusted Vegetation Index (SAVI) image, an Enhanced Vegetation Index (EVI) image, a Perpendicular Vegetation Index (PVI) image, a Photochemical Reflectance Index (PRI) image, a Structure Insensitive Pigment Index (SIPI) image, a Plant Sensing Reflectance Index (PSRI) image, a Chlorophyll Index (CI) image, a Modified Simple Ratio (mSR) image, a Modified Normalized Difference (mND) image, a Cup Chlorophyll Index (CCI) image, a Water Index (WI) image, a Normalized Difference Water Index (NDWI) image, a Cellulose Absorption Index (CAI) image, a Ratio Vegetation Index (RVI) image, a Vegetation Index Type (KVI) image, and a Difference Vegetation Index (DVI) image.
(B-3) The information processing system described in (B-1) above, wherein the imaging unit is configured to include image sensors for each color of the RGB image and an image sensor for NIV. (B-4) The information processing system described in (B-3) above, wherein the imaging unit has a planar array of pixels having the colors for the RGB image and for NIV. (B-5) The information processing system described in (B-3) above, wherein the imaging unit has pixels having the colors for the RGB image and for NIV that are arranged to be stacked in a light traveling direction. (B-6) The information processing system described in (B-1) above, further including a growth index image computing unit that computes the growth index image of the agricultural produce based on the red image and the NIV image of the region in the image that is specified by the specification unit and in which the object, which is the agricultural produce, is imaged, wherein the growth index computing unit computes a growth index of the agricultural produce based on the growth index image computed by the growth index image computing unit. (B-7) The information processing system described in (B-6) above, wherein the growth index image computing unit computes the growth index image from the reflectance of near-infrared rays obtained based on the red image and the NIV image of the region in the image that is specified by the specification unit and in which the object, which is the agricultural produce, is imaged, and a growth index of the agricultural produce is computed based on the average, variance, or high-order variance of the growth index image. (B-8) The information processing system described in (B-1) above, further including an RGB image growth index computing unit that computes an RGB image growth index of the agricultural produce based on the RGB image of the region in the image that is specified by the specification unit and in which the object, which is the agricultural produce, is imaged, wherein the growth index computing unit computes a growth index of the agricultural produce based on the RGB image growth index computed by the RGB image growth index computing unit. (B-9) The information processing system described in (B-8) above, wherein the RGB image growth index computing unit computes the RGB image growth index from a ratio of a predetermined color in the RGB image of the region in the image that is specified by the specification unit and in which the object, which is the agricultural produce, is imaged. (B-10) The information processing system described in (B-1) above, further including a parallax image growth index computing unit that computes a parallax image growth index based on a parallax image obtained from at least two images obtained by capturing the same object, which is the agricultural produce, from different angles, the at least two images being RGB images of the region in the image that is specified by the specification unit and in which the object, which is the agricultural produce, is imaged, wherein the growth index computing unit computes a growth index of the agricultural produce based on the parallax image growth index computed by the parallax image growth index computing unit.
(B-11) The information processing system described in (B-10) above, wherein the parallax image growth index computing unit computes the parallax image growth index from the size of the agricultural produce, the size being estimated based on the distance to the agricultural produce in an imaging direction and computed based on the parallax image obtained from the at least two images obtained by capturing the same object, which is the agricultural produce, from different angles, the at least two images being RGB images of the region in the image that is specified by the specification unit and in which the object, which is the agricultural produce, is imaged. (B-12) The information processing system described in (B-1) above, further including a storage unit that stores a position of the imaging unit, an image captured by the imaging unit, a date and time of capture of the image captured by the imaging unit, and a growth index of each agricultural produce captured by the imaging unit as management information in association with information identifying the imaging unit, and a harvest time computing unit that computes an appropriate predicted harvest time of the agricultural produce based on the growth index of each agricultural produce stored in the storage unit and the relationship between a growth index and a harvest time of each agricultural produce in the past, wherein the storage unit also stores the appropriate predicted harvest time information computed by the harvest time computing unit in association with the information identifying the imaging unit. (B-13) The information processing system described in (B-12) above, wherein a sensor provided with the imaging unit, a server that manages the storage unit that stores the management information, and a terminal device that makes an inquiry for a harvest time to the server are included, and when an inquiry for the appropriate predicted harvest time is received from the terminal device, the server generates response information that includes the appropriate predicted harvest time based on the management information stored in the storage unit in response to the inquiry, and transmits the response information to the terminal device. (B-14) An information processing method of an information processing system, including: capturing an image of agricultural produce as an RGB image and a near-infrared ray (NIV) image; specifying a region of the image in which an object, which is the agricultural produce, is imaged; and computing a growth index of the agricultural produce based on a growth index image obtained from the RGB image, the NIV image, and a red image of the RGB image of the region in the image that is specified in the specifying and in which the object is imaged. (B-15) A program that causes a computer that controls an information processing system to perform: capturing an image of agricultural produce as an RGB image and a near-infrared ray (NIV) image; specifying a region of the image in which an object, which is the agricultural produce, is imaged; and computing a growth index of the agricultural produce based on a growth index image obtained from the RGB image, the NIV image, and a red image of the RGB image of the region in the image that is specified in the specifying and in which the object is imaged.
(B-16) An imaging device, including an imaging unit that captures an image of agricultural produce as an RGB image and a near-infrared ray (NIV) image, a specification unit that specifies a region of the image in which an object, which is the agricultural produce, is imaged, and a growth index computing unit that computes a growth index of the agricultural produce based on a growth index image obtained from the RGB image, the NIV image, and a red image of the RGB image of the region in the image that is specified by the specification unit and in which the object is imaged. (B-17) The imaging device described in (B-16) above, wherein the growth index image is any one of, or a combination of, a Normalized Difference Vegetation Index (NDVI) image, a Simple Ratio (SR) image, a Global Environment Monitoring Index (GEMI) image, a Soil Adjusted Vegetation Index (SAVI) image, an Enhanced Vegetation Index (EVI) image, a Perpendicular Vegetation Index (PVI) image, a Photochemical Reflectance Index (PRI) image, a Structure Insensitive Pigment Index (SIPI) image, a Plant Sensing Reflectance Index (PSRI) image, a Chlorophyll Index (CI) image, a Modified Simple Ratio (mSR) image, a Modified Normalized Difference (mND) image, a Cup Chlorophyll Index (CCI) image, a Water Index (WI) image, a Normalized Difference Water Index (NDWI) image, a Cellulose Absorption Index (CAI) image, a Ratio Vegetation Index (RVI) image, a Vegetation Index Type (KVI) image, and a Difference Vegetation Index (DVI) image. (B-18) The imaging device described in (B-16) above, wherein the imaging unit is configured to include image sensors for each color of the RGB image and an image sensor for a near-infrared ray. (B-19) An image processing method, including: capturing an image of agricultural produce as an RGB image and a near-infrared ray (NIV) image; specifying a region of the image in which an object, which is the agricultural produce, is imaged; and computing a growth index of the agricultural produce based on a growth index image obtained from the RGB image, the NIV image, and a red image of the RGB image of the region in the image that is specified in the specifying and in which the object is imaged. (B-20) A program that causes a computer that controls an imaging device to perform: capturing an image of agricultural produce as an RGB image and a near-infrared ray (NIV) image; specifying a region of the image in which an object, which is the agricultural produce, is imaged; and computing a growth index of the agricultural produce based on a growth index image obtained from the RGB image, the NIV image, and a red image of the RGB image of the region in the image that is specified in the specifying and in which the object is imaged. [00176] As used herein, "at least one", "one or more", and "and/or" are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions "at least one of A, B and C", "at least one of A, B or C", "one or more of A, B and C", "one or more of A, B or C" and "A, B and/or C" means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together. [00177] It is noted that the terms "a" or "an" entity refer to one or more of that entity. As such, the terms "a" (or "an"), "one or more" and "at least one" can be used interchangeably herein.
It should also be noted that the terms "comprising", "including" and "having" may be used interchangeably.

[00178] The terms "determine", "calculate" and "compute", and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation or technique.

[00179] The term "computer-readable media", as used herein, refers to any tangible storage and/or transmission media that participate in providing instructions to a processor for execution. Such media may take many forms, including, but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, NVRAM, or magnetic or optical disks. Volatile media include dynamic memory, such as main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, magneto-optical media, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a solid-state medium such as a memory card, any other memory chip or cartridge, a carrier wave as described below, or any other medium from which a computer can read. A digital file attachment to electronic mail or another self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. When the computer-readable media are configured as a database, it is to be understood that the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the invention is considered to include a tangible storage medium or distribution medium, and prior-art-recognized equivalents and successor media, in which the software implementations of the present invention are stored.

[00180] The term "module", as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element. Also, while the invention is described in terms of exemplary embodiments, it should be understood that individual aspects of the invention can be separately claimed.

[00181] It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors, insofar as they are within the scope of the appended claims or the equivalents thereof.
Reference Signal List

11, 11-1 to 11-N Sensor camera
12, 12-1 to 12-4 Terminal device
13 Network
14 Server
31 Sensor
32 RGB imaging unit
33 NDVI imaging unit
34 Control unit
35 IP address storage unit
36 GPS
37 Environment information measurement unit
38 RTC
39 Growth situation information generation unit
40 Communication unit
41 Communication path specification unit
61 Control unit
62 Inquiry unit
63 Operation unit
64 Communication unit
65 IP address storage unit
66 Display unit
81 Control unit
82 Growth situation information accumulation unit
83 Target region specification unit
84 Growth index computing unit
85 NDVI image growth index computing unit
86 Stereoscopic image growth index computing unit
87 Harvest time computing unit
88 Management information accumulation unit
89 Growth index computing unit
90 Mapping unit
91 Sensor camera operation status monitoring unit
92 Communication unit
93 Harvest plan creation unit
94 Distribution plan creation unit
95 Sales plan creation unit
96 Acquisition plan creation unit
97 Inquiry reception unit
98 Response creation unit
Claims (21)

[0001] 1. Method, characterized by the fact that it comprises: obtaining image information of an organism, the image information comprising a set of optical data; calculating a growth rate based on the set of optical data; and calculating a predicted harvest time based on the growth rate, wherein the image information comprises a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the set of image data from at least two positions and at least two different angles, wherein the set of optical data is obtained from an image sensor comprising a set of RGB color filters provided on top of a set of infrared-cut (IR-cut) filters, wherein the set of infrared-cut filters is provided on a set of RGB sensors, wherein the image sensor further comprises a black filter provided on a near-infrared ray (NIR) sensor, without an infrared-cut filter provided over the NIR sensor and thus without blocking NIR from the NIR sensor, and wherein an image captured from the RGB sensors and the NIR sensor is representative of the organism.

[0002] 2. Method according to claim 1, characterized in that it further comprises transferring the predicted harvest time to an external party.

[0003] 3. Method according to claim 2, characterized in that the external party is at least one of a retailer, a general consumer, a restaurant and a food producer.

[0004] 4. Method according to claim 1, characterized in that the image information comprises visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, wherein the visible image data are generated based on a demosaiced RGB pixel signal so as to provide R, G and B image signal components, and wherein the non-visible image data are generated based on a demosaiced W and IR signal.

[0005] 5. Method according to claim 1, characterized in that the image information comprises visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, wherein the visible image data are generated based on a demosaiced RGB pixel signal so as to provide R, G and B image signal components, and wherein the non-visible image data are generated based on a W and IR signal without demosaicing.

[0006] 6. Method according to claim 1, characterized in that the set of optical data is obtained using a stacked-type image sensor, wherein the stacked-type image sensor has a blue light sensor layer stacked on a green light sensor layer, wherein the green light sensor layer is stacked on a red light sensor layer, and wherein the red light sensor layer is stacked on a near-infrared ray (NIR) sensor layer.

[0007] 7. Method according to claim 1, characterized in that the set of optical data is obtained using an image sensor comprising a set of RGB color filters provided on a laminated film, wherein the laminated film comprises at least one of SiO and SiN, and wherein the set of RGB color filters comprises an FR color filter, an FG color filter and an FB color filter.

[0008] 8. Method according to claim 1, characterized in that it further comprises: calculating parallax image data based on at least two of the image data from the set of image data obtained from the at least two image capture devices; and calculating the growth rate based on the parallax image data.
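A minimal sketch of the predicted-harvest-time step in claim 1, read together with item (B-12) above: the current growth-rate trajectory is extrapolated to the growth rate at which the same produce was harvested in past seasons. The linear fit and all names and values below are illustrative assumptions; the claims do not prescribe a particular estimator.

```python
from datetime import date, timedelta
import numpy as np

def predict_harvest_date(dates: list[date], indices: list[float],
                         harvest_index: float) -> date:
    """Fit a line to (day, growth rate) samples and return the date at which
    the fitted value reaches the growth rate observed at past harvests."""
    days = np.array([(d - dates[0]).days for d in dates], dtype=float)
    slope, intercept = np.polyfit(days, np.asarray(indices, dtype=float), 1)
    if slope <= 0:
        raise ValueError("growth rate is not increasing; cannot extrapolate")
    return dates[0] + timedelta(days=float((harvest_index - intercept) / slope))

# Assumed example values: three weekly NDVI-based growth rates, with 0.70
# taken as the level at which this produce was harvested in past seasons.
# predict_harvest_date([date(2014, 3, 1), date(2014, 3, 8), date(2014, 3, 15)],
#                      [0.42, 0.51, 0.58], harvest_index=0.70)
```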
[0009] 9. System, characterized in that it comprises: an image capture device, wherein at least one of a server and the image capture device is configured to: obtain image information of an organism, the image information comprising a set of optical data; calculate a growth rate based on the set of optical data; and calculate a predicted harvest time based on the growth rate, wherein the image information comprises a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the set of image data from at least two positions and at least two different angles, wherein the set of optical data is obtained from an image sensor comprising a set of RGB color filters provided on a set of infrared-cut (IR-cut) filters, wherein the set of infrared-cut filters is provided on a set of RGB sensors, wherein the image sensor further comprises a black filter provided on a near-infrared ray (NIR) sensor, without an infrared-cut filter provided over the NIR sensor and thus without blocking NIR from the NIR sensor, and wherein an image captured from the RGB sensors and the NIR sensor is representative of the organism.

[0010] 10. System according to claim 9, characterized in that it further comprises the server, wherein the image capture device is in communication with the server.

[0011] 11. System according to claim 10, characterized in that the at least one of the server and the image capture device is further configured to transfer the predicted harvest time to an external party, and wherein the server is further configured to notify the external party when the predicted harvest time is reached.

[0012] 12. System according to claim 9, characterized in that the image information comprises visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, wherein the visible image data are generated based on a demosaiced RGB pixel signal so as to provide R, G and B image signal components, and wherein the non-visible image data are generated based on a demosaiced W and IR signal.

[0013] 13. System according to claim 9, characterized in that the image information comprises visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, wherein the visible image data are generated based on a demosaiced RGB pixel signal so as to provide R, G and B image signal components, and wherein the non-visible image data are generated based on a W and IR signal without demosaicing.

[0014] 14. System according to claim 9, characterized in that the set of optical data is obtained using a stacked-type image sensor, wherein the stacked-type image sensor has a blue light sensor layer stacked on a green light sensor layer, wherein the green light sensor layer is stacked on a red light sensor layer, and wherein the red light sensor layer is stacked on a near-infrared ray (NIR) sensor layer.

[0015] 15. System according to claim 9, characterized in that the set of optical data is obtained using an image sensor comprising a set of RGB color filters provided on a laminated film, wherein the laminated film comprises at least one of SiO and SiN, and wherein the set of RGB color filters comprises an FR color filter, an FG color filter and an FB color filter.
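As an illustration of the demosaicing referenced in claims 12 and 13, an RGB+NIR mosaic can be split into four full-resolution planes. The 2x2 pixel layout assumed below is chosen for illustration only; the disclosure describes several sensor arrangements, and nearest-neighbor upsampling stands in for whatever interpolation an implementation would use.

```python
import numpy as np

def demosaic_rgb_nir(raw: np.ndarray) -> dict[str, np.ndarray]:
    """Split an H x W RGB+NIR mosaic (assumed layout: R at (0,0), G at (0,1),
    B at (1,0), NIR at (1,1) of every 2x2 cell; H and W even) into four
    full-resolution planes by nearest-neighbor upsampling."""
    planes = {"R": raw[0::2, 0::2], "G": raw[0::2, 1::2],
              "B": raw[1::2, 0::2], "NIR": raw[1::2, 1::2]}
    # Repeat each quarter-resolution plane 2x along both axes back to H x W.
    return {name: np.repeat(np.repeat(p, 2, axis=0), 2, axis=1)
            for name, p in planes.items()}
```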
[0016] 16. System according to claim 9, characterized in that at least one of the server and the image capture device is further configured to: calculate parallax image data based on at least two of the image data from the set of image data obtained from the at least two image capture devices; and calculate the growth rate based on the parallax image data.

[0017] 17. Non-transitory, tangible computer-readable media having stored thereon instructions that cause a processor to execute a method, characterized in that the method comprises: obtaining image information of an organism, the image information comprising a set of optical data; calculating a growth rate based on the set of optical data; and calculating a predicted harvest time based on the growth rate, wherein the image information comprises a set of image data from at least two image capture devices, wherein the at least two image capture devices capture the set of image data from at least two positions and at least two different angles, wherein the set of optical data is obtained from an image sensor comprising a set of RGB color filters provided on a set of infrared-cut (IR-cut) filters, wherein the set of infrared-cut filters is provided on a set of RGB sensors, wherein the image sensor further comprises a black filter provided on a near-infrared ray (NIR) sensor, without an infrared-cut filter provided over the NIR sensor and thus without blocking NIR from the NIR sensor, and wherein an image captured from the RGB sensors and the NIR sensor is representative of the organism.

[0018] 18. Computer-readable media according to claim 17, characterized in that the method further comprises transferring the predicted harvest time to an external party.

[0019] 19. Computer-readable media according to claim 18, characterized in that the external party is at least one of a retailer, a general consumer, a restaurant and a food producer.

[0020] 20. Computer-readable media according to claim 17, characterized in that the image information comprises visible image data obtained from an image sensor and non-visible image data obtained from the image sensor, wherein the visible image data are generated based on a demosaiced RGB pixel signal so as to provide R, G and B image signal components, and wherein the non-visible image data are generated based on a demosaiced W and IR signal.

[0021] 21. Computer-readable media according to claim 17, characterized in that the method further comprises: calculating parallax image data based on at least two of the image data from the set of image data obtained from the at least two image capture devices; and calculating the growth rate based on the parallax image data.
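As a rough illustration of the parallax-based calculation in claims 8, 16 and 21 (and in item (B-11) above), the size of the imaged produce can be recovered from stereo disparity by similar triangles. The formulas below are standard stereo geometry; every numeric value and name is an assumption for illustration only.

```python
def size_from_parallax(disparity_px: float, extent_px: float,
                       focal_px: float, baseline_m: float) -> float:
    """Estimate the physical size (in metres) of produce whose stereo
    disparity is `disparity_px` and whose image extent is `extent_px`,
    for two cameras with focal length `focal_px` (in pixels) separated
    by a baseline of `baseline_m` (in metres)."""
    depth_m = focal_px * baseline_m / disparity_px  # Z = f * B / d
    return extent_px * depth_m / focal_px           # size = s * Z / f

# Assumed example: f = 1400 px, baseline = 0.10 m, disparity = 140 px,
# produce spanning 120 px -> depth 1.0 m, size of about 0.086 m.
```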
|